The Army infantryman is credited with saving the lives of three fellow soldiers.

The Silicon Valley company’s win caps a contentious fight with a Beltway giant.

Latest announcement reveals a disconnect between White House depiction and eyewitness reports.


Analysis also shows schools for the children of U.S. military personnel at home and abroad would be among the most impacted projects.

Adm. Karl Schultz delivered a ‘State of the Coast Guard’ address in Los Angeles.

Syrian forces believe suspects are linked to a suicide bombing in January in the city of Manbij.

Federal judges have ruled against the Trump administration at least 63 times, often agreeing with plaintiffs that agency decision-making is arbitrary and capricious.

They include new dining halls, schools and fire stations on bases, as well as firing ranges, aircraft hangars and flight simulation facilities. 

Beijing’s recent behavior in the Arctic has triggered some alarms in the Pentagon.

President Trump’s former acting attorney general met with Reps. Jerrold Nadler (D-N.Y.) and Douglas A. Collins (R-Ga.) in private after questions were raised about his public testimony last month.

Pentagon memo orders policy to come into effect across the armed forces starting April 12.

Currently prohibited by the Intermediate-Range Nuclear Forces Treaty, the missiles will no longer be banned once U.S. withdrawal from the pact is complete this summer.

Disputes over wall funding and budget caps threaten to overshadow a budget request designed to demonstrate a military shift toward great-power competition.

Officials have touted the program as a way to speed up vetting of recruits who have what the Pentagon considers “foreign nexus” risks.

The administration more than doubled its requested money for the Overseas Contingency Operations account, which is excluded from defense budget caps.

The president’s “cost plus 50” formula has struck fear in the hearts of countries that host American troops.

FBI data reviewed by The Post show most people arrested in counterterrorism probes are not charged with terrorism.

The Arizona senator, who disclosed she was raped while in the Air Force, broke with other lawmakers by arguing that sexual assault cases should remain within the chain of command.

Autocratic nations that learn their citizens enlisted in the U.S. Army could punish recruits or their families with jail time, harsh interrogations or worse, a lawmaker said.




Pentagon takes aim at China and Russia in proposed $750 billion budget - The Washington Post


Acting defense secretary Patrick Shanahan at the Pentagon on Tuesday. (Carolyn Kaster/AP)

The Pentagon on Tuesday unveiled details of the $750 billion national defense budget that the Trump administration has asked Congress to pass, calling it an example of how the military is shifting its emphasis from counterinsurgencies to competition with China and Russia.

The issue is likely to feature prominently in acting defense secretary Patrick Shanahan’s testimony before Congress, scheduled for Thursday. He has said previously the 2020 budget would be a “masterpiece” demonstrating how the Pentagon is adapting to the great-power strategy.

But President Trump’s plan to take money from the Pentagon budget for the border wall and attempt to raise the defense budget without agreeing to hikes in nonmilitary spending has angered Democrats, setting the stage for negotiations that are more hostile than usual and overshadowing the strategic realignment.

The budget request showed some trade-offs the Pentagon would be expected to make to recalibrate the military bureaucracy toward China and Russia after more than a decade and a half spent focusing on wars in Iraq and Afghanistan.

The Navy plans to retire one of its aircraft carriers early and invest in drone ships; the Army is looking to scale back investments in legacy helicopters and fighting vehicles and instead buy high-end versions; and the Air Force is dramatically increasing its investments in space.

Whether the changes go far enough to reshape the military for a new mission is a matter of debate that will play out in public over the coming months as the Pentagon seeks to reach an agreement with Congress over what proposals will proceed and earn funding.

The $750 billion request comprises $718 billion for the Defense Department and $32 billion for defense-related activities at other agencies, primarily nuclear weapons programs at the Energy Department. The budget represents a nearly 5 percent increase over the current fiscal year but, when adjusted for inflation, falls below overall defense-spending highs during the peak of the wars in Iraq and Afghanistan.

The Pentagon said Tuesday the budget would help ensure peace with Russia and China by building a U.S. military capable of defeating them in a conflict.

“The stakes are clear,” acting deputy defense secretary David L. Norquist said Tuesday at the Pentagon. “If we want peace, adversaries need to know there is no path to victory by fighting us.”

Norquist said the budget represented the largest research, development, test and evaluation request submitted to Congress in 70 years, a testament to the Pentagon’s focus on developing new technologies. He highlighted significant new investments in cyberwarfare, hypersonic weapons, artificial intelligence, lasers and space, including the creation of a Space Force.

Monday’s request fires the starting gun in negotiations with Capitol Hill over what form and size the ultimate defense budget will take when appropriated.

More than anything else, the budget request is a reflection of the administration’s priorities rather than an indication of the actual amount of money that will be spent on individual programs. Because Congress carries the power of the purse in Washington, lawmakers decide how much money to appropriate for the stated priorities.

Top Pentagon officials initially suggested that the defense budget might be cut as a result of the Trump administration’s efforts to control spending in response to a rising deficit, but the rollout made it clear that the White House wants to raise the defense budget but cut nonmilitary discretionary spending.

The administration asked for a 139 percent increase in the Pentagon war-fighting account, which funds active conflicts in Iraq and Afghanistan, because that account does not fall under congressionally mandated budget caps that extend for two more years.

Pentagon officials said Monday that only $67 billion of the $165 billion they requested in that account is actually for funding those conflicts, an acknowledgment that the size of the request — the biggest since 2008 — is simply a way to increase the defense budget while complying with the caps. The officials also recognized that the White House dictated the strategy of inflating the war-fighting budget, known as Overseas Contingency Operations, to achieve the desired defense spending levels overall. 

Democrats have rejected the approach outright.

“I think there are other important priorities in this country, and if we spend all the money on defense, we aren’t going to be able to meet those priorities,” House Armed Services Committee Chairman Adam Smith (D-Wash.) said Tuesday. “Witness the budget the president just sent.”

Much of the attention during the rollout Tuesday fell on Trump’s plan to take billions of dollars from the military budget for construction of a border wall without approval from Congress, using a combination of emergency and counterdrug authorities. 

The Pentagon’s request included $3.6 billion to “backfill” money Trump plans to take for the wall from the Pentagon’s military construction budget this year, as well as an additional $3.6 billion for possible border infrastructure funding during the coming fiscal year.

Deputy Undersecretary of Defense Elaine McCusker said at a briefing that she could not say which programs the money would backfill, because the department has not yet released a list of military construction projects that might lose funding to pay for construction of the wall.

Travis Sharp, a research fellow at the Center for Strategic and Budgetary Assessments, said the effort to take money from the Pentagon budget for the wall could jeopardize the military’s plans as the administration and Congress head into what could be an acrimonious budget season.

“DOD’s carefully laid plans have been undercut by the White House’s waffling about the top line last fall, stuffing the budget with money for the border wall and routing funds through the war account in an accounting gimmick,” Sharp said. “The Pentagon is not to blame for those things, but it may still suffer the consequences.”

Still, there are positive indications that this budget will begin the shift toward strategic competition with China and Russia, said Susanna Blume, a Pentagon official during the Obama administration and deputy director of the defense program at the Center for a New American Security.

For example, Blume said, the decision to increase the number of Virginia-class nuclear-powered fast-attack submarines from two to three, invest in unmanned Navy vessels and finalize a decision on how to organize the Space Force represent steps toward implementing the strategy.

The budget request devotes $31 billion to the modernization of the nation’s nuclear triad, aimed primarily at Russia. That includes $3 billion to advance the design and manufacture of the new B-21 bomber, $2.2 billion for the new Columbia-class nuclear submarine, $700 million for a new long-range standoff missile and $600 million to overhaul the intercontinental ballistic missile force.

Some choices appeared out of step with the strategy. The Pentagon requested less money for the European Deterrence Initiative, the main program that bolsters allies in Europe to deter possible incursions from Russia. The Defense Department requested $5.9 billion, $600 million less than the amount Congress appropriated this year. 

McCusker said the Pentagon spent robustly on the program last year and had already completed one-off infrastructure and repositioning investments, and therefore did not need to spend more on the effort this year. She said the Pentagon was also looking at “increased burden-sharing” for the initiative, meaning more expenditures by European allies.

The Pentagon also faced questions over a decision to request a mix of older and newer fighter jets, after Trump criticized the F-35 Joint Strike Fighter. The budget calls for 78 F-35s from Lockheed Martin at a cost of $11.2 billion. But it also requested eight new variants of the older F-15 fighter jet made by Boeing, at a cost of $1.1 billion, the first acquisition of an F-15 since 2001.

“Air Force officials said they made the decision in part to meet their current ambitious goals for readiness and capacity,” Sharp said. “But skeptics will see the move as prioritizing the present over the future.”

Despite rolling out an ambitious administration policy that sought a large-scale recalibration of missile defenses, the administration requested $9.43 billion for the Missile Defense Agency, a decrease of $1.06 billion from the enacted 2019 budget.

The MDA requested $157.4 million for defenses against hypersonic threats and $303.5 million for initiatives to demonstrate the capability of sensors to track ballistic missile targets, with the goal of ultimately using the technology in space to track missiles. Both of the programs are aimed at countering emerging capabilities from Russia and China. 

Military to begin enforcing Trump’s restrictions on transgender troops - The Washington Post


Democratic lawmakers and civil rights advocates decried the Pentagon’s memo on transgender troops as bigoted. (Charles Dharapak/AP)

The military will begin enforcing President Trump’s restrictions on transgender troops on April 12, according to a Pentagon memo, which drew rebukes from Democratic lawmakers and civil rights advocates who decried the change as bigoted.

The memo stipulates that a history of gender dysphoria would disqualify applicants to the military unless they have been stable in their biological sex for 36 months, are willing to abide by the rules for that sex, and have not transitioned and do not need to in the view of medical providers.

Those who are already in the military or under contract to join before the start date will fall under the 2016 policy enacted by the Obama administration. That policy allowed people who have transitioned to join the military and gave those already serving an opportunity to transition while in the armed forces.

It also allowed service members to change their gender marker in the military system and abide by uniform, grooming and facilities rules for their new identity.

None of that is allowed under the new restrictions.

Under the new policy, secretaries of the military services will be given latitude to grant exceptions to certain individuals, who would then be able to access medical care in accordance with the old policy.

The decision by the Defense Department to begin enforcing the policy comes more than a year and a half after Trump announced the ban by tweet in July 2017, surprising his own defense secretary.

“After consultations with my Generals and military experts, please be advised that the United States Government will not accept or allow Transgender individuals to serve in any capacity in the U.S. Military,” Trump wrote at the time. “Our military must be focused on decisive and overwhelming victory and cannot be burdened with the tremendous medical costs and disruption that transgender in the military would entail. Thank you.” 

The actual Pentagon policy that former defense secretary Jim Mattis formulated and rolled out after the tweet stopped short of a categorical ban, according to defense officials. It allows transgender individuals to serve in the military, as long as they aren’t diagnosed with gender dysphoria, haven’t transitioned sex and don’t need to, and submit to rules for their biological gender regarding uniforms, grooming and facilities.

Critics, however, say those rules amount to a de facto ban, because they essentially allow transgender people to serve in the military only if they refrain from transitioning or engaging in activities that allow them to live out their identity on the job. The policy also bans those who have already transitioned sex from joining outright.

Democrats, who are hoping to reverse the ban through bipartisan legislation, criticized the Pentagon’s decision to begin enforcing the measure.

“Anyone who is qualified and willing should be allowed to serve their country openly,” House Armed Services Committee Chairman Adam Smith (D-Wash.) said in a statement. “Make no mistake, this is a discriminatory ban on transgender people, not a ban on a medical condition, and we will continue to fight against this bigoted policy.”

Civil rights advocates said transgender individuals should be able to serve in the military like everyone else who meets the standards required for the job.

Jennifer Levi, transgender rights project director at GLBTQ Legal Advocates and Defenders, called the Pentagon’s enforcement of the new rules “deeply immoral and deeply insulting to the many transgender troops who are bravely serving their country.”

“Military leaders, medical experts, and the vast majority of the American public agree that our troops deserve gratitude and support, not a slap in the face based on bias and irrational fears,” Levi said.

Defense officials say they have no way of tracking the precise number of transgender service members, but based on self-identification they estimate that about 9,000 troops identify as transgender, roughly 1,000 of whom have been diagnosed with gender dysphoria.

The rollout of the memo days before acting defense secretary Patrick Shanahan is due to appear before the Senate Armed Services Committee on Thursday adds a level of tension to a testimony already expected to include hostile questions over Trump’s decision to take Pentagon funds without congressional approval for a border wall.

Shanahan, a former Boeing executive who is hoping to get the nod from Trump to replace Mattis, has testified before Congress only once before, during his confirmation hearing to become deputy secretary.

The Pentagon’s decision to press forward with implementing the new restrictions comes after several court cases challenged the measure, leading to injunctions that delayed implementation for months.

But in January, the Supreme Court, in a 5-4 vote along partisan lines, allowed Trump’s broad restrictions to go into effect. The legal battle continued in the lower courts, and lawyers representing the claimants have argued that ongoing proceedings in the U.S. Court of Appeals for the D.C. Circuit should have prevented the Pentagon from moving ahead.

Attorneys for transgender service members in the federal case in Washington said in a filing Wednesday that the government’s “official order directing a change in the government’s policy regarding transgender service members violates the injunction in this case.”

“The government may not depart from the status quo or order any new policy inconsistent with that injunction,” the lawyers said, until they have had time to decide whether to ask the full D.C. Circuit to review the ruling of a three-judge panel.

“Defendants are disregarding both those interests and the authority of this Court. They must not be permitted to cast aside the ordinary procedures that safeguard those constitutional interests, nor may they be permitted to usurp the Court’s jurisdiction to determine when its injunction has expired,” according to the court filing.

Because the Pentagon restrictions do not go into effect until April 12, the court proceedings theoretically could be resolved by then, though it would be a quick turnaround. If they progress past that date, the Pentagon could be forced to further delay implementation.

Transgender-rights activists, meanwhile, have been urging the public to get behind the legislation proposed in Congress in a bid to override Trump’s decision.

“We will continue our fight in the courts until the ban is permanently blocked,” said Shannon Minter, legal director of the National Center for Lesbian Rights. “We also strongly support the bipartisan efforts of congressional leaders to pass urgently needed legislation to protect transgender troops. We urge everyone who cares about the integrity of our military and the well-being of our troops to contact your representatives and tell them to support this legislation.”

Dallas woman charged with felony after alleged assault by Austin Shuffield - The Washington Post

A black woman was beaten by a white man with a gun. Police charged her with damaging his truck.

On the night when a gun was pulled, punches were thrown and car windows smashed, transforming Dallas into the latest flash point of criminal justice and race relations, everything began with a traffic dispute.

L’Daijohnique Lee was going the wrong way down a one-way street in the Deep Ellum neighborhood on March 21. She was dropping off a friend, she said. Austin Shuffield, 30, was trying to leave a parking lot. He tried to take a picture of her license plate. Lee, 24, threatened to mace him if he didn’t back away, WFAA reported, citing a police affidavit.

A bystander video captured what happened next. Shuffield, who is white, clutches a pistol at his side and steps toward Lee, who is black. She pulls out her phone to dial 911. Shuffield swats the phone to the ground, and Lee connects a punch. Then Shuffield winds up for at least five hard blows to Lee’s head, sending her reeling. Then he kicks her phone down the street.

The video roared across social media and prompted calls for serious charges against Shuffield. But a felony charge landed first for Lee — the assault victim.

Lee was charged Tuesday with felony criminal mischief after allegedly smashing the windows of Shuffield’s truck after the incident. That decision triggered more protests in Dallas, including one planned at city hall Wednesday, the Dallas Morning News reported, as the video and Lee’s story spread.

On Wednesday, the Dallas County district attorney’s office said Lee’s warrant was recalled. “The case has been declined for prosecution,” said Kimberlee Leach, a spokeswoman for the office. It was not immediately clear why.


Austin Shuffield. (Dallas County Sheriff's Department)

The initial felony charge against Lee raised questions about whether it was appropriate to charge an assault victim.

“She’s obviously in distress. You can’t consider these things outside of context,” said her attorney S. Lee Merritt, who criticized authorities for filing a felony charge against Lee before they focused on Shuffield.

Shuffield has been referred for a felony assault charge to the Dallas County district attorney’s office to consider for a grand jury, police said. He has not been formally charged with any felonies.

“We understand that some people are upset,” Dallas Deputy Police Chief Thomas Castro said in a news conference Tuesday, when he announced the now-dropped charges.

“It’s not the intention of the Dallas Police Department to pick one side or the other. We simply had information that was provided to us on a criminal offense.”

Police first charged Shuffield with public intoxication, interfering with an emergency call and assault — all misdemeanors.

But following a public outcry, an additional charge of unlawful carrying of a weapon was added on March 28. That is also a misdemeanor, Dallas police spokeswoman Sgt. Nicole Watson said. The recommendation for aggravated assault with a deadly weapon was made on the same day.

An attorney for Shuffield, J.R. Cook, declined to comment. Shuffield told detectives days later he feared for his life after Lee allegedly threatened to have friends shoot him, WFAA reported.

Lee also spoke about her fears from the encounter.

“All I could do was try to protect myself. He literally sat there and beat me like a man,” Lee told WFAA soon after the incident.

At the news conference Tuesday, Castro was asked by a reporter if it was typical for crime victims to later be charged for what came after the crime.

He appeared to wince. “Each case is unique. Each case has its own set of circumstances,” he said.

Terms of Service - The Washington Post

Published: July 1, 2014.

These Terms of Service (“Terms”) apply when you use a website, mobile application, or other online service (collectively, the “Services”) that links or refers to the Terms. These Terms are a legal contract between you and WP Company LLC (“The Washington Post,” “we” or “us”), so it is important that you review them carefully before using the Services. Your use of the Services indicates that you agree to follow and be bound by the Terms, which include the Discussion and Submission Guidelines. If you do not agree to the Terms, do not access or use the Services.

THESE TERMS CONTAIN DISCLAIMERS OF WARRANTIES (SECTION 11) AND DISCLAIMERS OF LIABILITY (SECTION 12).

1. General

We may change the Terms or modify any features of the Services at any time at our sole discretion. The most current version of the Terms can be viewed by clicking on the “Terms of Service” link at the bottom of the Services’ home page. If you continue to use the Services after changes are posted you will be deemed to have accepted the change.

2. Compliance With Applicable Laws

As a condition of your access to and use of the Services, you agree that you will not use the Services for any purpose that is unlawful or prohibited by these Terms and that you will comply with all applicable laws and any conditions or restrictions imposed by these terms. The Services are offered for your personal and non-commercial use only, and you are prohibited from using, and are expressly not granted the right to use, the Services for any other purpose.

3. Privacy

By using the Services, you indicate that you understand the information collection, use, and disclosure practices described in the Privacy Policy.

4. Discussion And Submission Guidelines

The Services allow you to post content and communicate with others. This content may include text, images, photographs, audio, video, or material in any other form. You represent that you have read and agree to abide by the Discussion and Submission Guidelines, which are incorporated by reference into these Terms, and that by making a submission you are consenting to its display and publication on the Services and in related online and offline promotional materials, in accordance with the guidelines. We may change or modify those guidelines at any time.

By posting content on, to, or through the Services, you give us the right to display such content on the Services and through affiliated publications and to distribute such content and use such content for promotional and marketing purposes, pursuant to the terms of the Discussion and Submission Guidelines. Specifically, you provide us with a royalty-free, irrevocable, perpetual, worldwide, exclusive, and fully sublicensable license to use, reproduce, modify, adapt, publish, translate, create derivative works from, incorporate into other works, distribute, perform, display, and otherwise exploit such content, in whole or in part in any form, media or technology now known or later developed.

5. Copyright

The Services (including, but not limited to, text, photographs, graphics, video, audio content, and computer code) are protected by copyright as collective works or compilations under the copyright laws of the United States and other countries. All individual articles, photographs, graphics, video, audio, and other content or elements comprising the Services are also copyrighted works. All copyrights in the Services are owned by us or by our third-party licensors to the extent permitted under the United States Copyright Act and all international copyright laws. Except for content that you have posted on the Services, or unless expressly authorized by The Washington Post in writing, you are prohibited from publishing, reproducing, distributing, entering into a database, displaying, performing, modifying, creating derivative works, transmitting, or in any way exploiting any part of the Services, except that you may make use of the content for your own personal use as follows: you may make one machine-readable copy and/or print copy that is limited to occasional articles of personal interest only. To obtain written consent to use a copyrighted work, please see our Reprints & Permissions section.

Just as The Washington Post requires users to respect our copyrights, and those of our affiliates and partners, we respect the copyrights of others. If you believe in good faith that your copyrighted work has been reproduced on our site without authorization in a way that constitutes copyright infringement, you may notify our designated copyright agent either by mail to Copyright Agent, c/o Legal Department, The Washington Post, 1301 K Street NW, Washington, DC 20071 or to copyrightagent@washpost.com. Please provide our copyright agent with the following information in writing:

•An electronic or physical signature of the person authorized to act on behalf of the owner of the exclusive right that is allegedly infringed;

•Identification of the copyrighted work or a representative list of the works claimed to have been infringed;

•Identification of the allegedly infringing material and information reasonably sufficient to permit us to locate the material;

•Your name, address, telephone number, and email address, so that we may contact you if necessary;

•A statement that you have a good faith belief that the disputed use is not authorized by the copyright owner, its agent, or the law; and

•A statement by you, made under penalty of perjury, that the above information in your notice is accurate and that you are the copyright owner or authorized to act on the copyright owner’s behalf.

6. Trade and Service Mark Rights.

All rights in the product names, company names, trade names, logos, service marks, trade dress, slogans, product packaging, and designs of the Services, whether or not appearing in large print or with the trademark symbol, belong exclusively to The Washington Post or its licensors and are protected from reproduction, imitation, dilution, or confusing or misleading uses under national and international trademark and copyright laws. The use or misuse of these trademarks or any materials, except as permitted herein, is expressly prohibited, and nothing stated or implied on the Services confers on you any license or right under any patent or trademark of The Washington Post, its affiliates, or any third party.

7. Prohibited Conduct

You may not access or use, or attempt to access or use, the Services to take any action that could harm us or any third party, interfere with the operation of the Services, or use the Services in a manner that violates any laws. For example, and without limitation, you may not:

•Post content that is prohibited by or otherwise not in compliance with these Terms (including the Discussion and Submission Guidelines).

•Make use of the contents of the Services in any manner that constitutes an infringement of our rights or the rights of other users or third parties, including copyrights.

•Access parts of the Services to which you are not authorized, or attempt to circumvent any restrictions imposed on your use or access of the Services.

•Copy, reproduce, distribute, publish, enter into a database, display, perform, modify, create derivative works, transmit, or in any way exploit any part of the Services, except for content you have posted on the Services, or unless expressly authorized. You may download material from the Services solely for your own personal use as follows: you may make one machine-readable copy and/or one print copy that is limited to occasional articles of personal interest only.

•Distribute any part of the Services over any network, including a local area network, or sell or offer it for sale. See our Reprints & Permissions section for more information on distribution. In addition, these files may not be used to construct any kind of database.

•Engage in unauthorized “scraping” or spidering, or harvesting of personal information, or use any unauthorized automated means to compile information.

•Take any action that imposes an unreasonable or disproportionately large load on our network or infrastructure.

•Use any device, software, or routine to interfere or attempt to interfere with the proper working of the Services or any activity conducted on the Services.

•Use or attempt to use any engine, software, tool, agent, or other device or mechanism (including, without limitation, browsers, spiders, robots, avatars, or intelligent agents) to navigate or search the Services other than the search engine and search agents available on the Services and other than generally available third-party web browsers.

•Attempt to decipher, decompile, disassemble, or reverse-engineer any of the software comprising or in any way making up a part of the Services

•Engage in any other conduct that restricts or inhibits any other person from using or enjoying the Services.

•Take any action that violates or threatens our system or network security.

Violations of these Terms may result in civil or criminal liability. We may investigate violations of these Terms and we may also work with law enforcement authorities to prosecute users who violate the Terms.

8. Registration and Security

To register for certain Services, you will create login credentials by providing an email address to us and by selecting a username and password. You also provide us certain information during the registration process, which you agree to keep accurate and updated. Each login is for a single user only. You are not allowed to share or disclose your login credentials with any other user or person. We may cancel or suspend your access to the Services if you share your credentials.

You may also sign in to certain Services using your Facebook login information.

You will be responsible for all usage and activity on your account, including use of the account by any third party authorized by you to use your login credentials, and for all charges for any goods or services. You are also responsible for all statements made or materials posted under your account, including liability for harm caused by such statements or materials. You may not transfer, sell, or otherwise assign your rights or obligations under these Terms.

You must be 13 years or older to use the Services. Any fraudulent, abusive, or otherwise illegal activity may be grounds for termination of your account, at our sole discretion, and we may refer you to appropriate law enforcement agencies.

9. Charges for Services

We may charge for access to portions of the Services or to the Services as a whole, and we reserve the right at any time to change the amount we charge for such access or for subscriptions that include authorization to access the Services. In such event, we will notify you in advance and give you an opportunity to subscribe (or unsubscribe) to the Service(s). More information about any such subscriptions can be found in the appropriate locations of the Services. You shall pay all applicable taxes relating to use of the Services, and recognize that any fees to third parties that may be required for you to receive the Services (such as mobile data plans and text-message charges) are not included in the cost of the Services.

10. Third-Party Content and Links to Third-Party Websites

The Services may contain third-party owned content and links to other websites (“Linked Sites”). The Washington Post does not endorse, sponsor, recommend, or otherwise accept responsibility for any Linked Sites. In addition, Linked Sites are not under the control of The Washington Post, and The Washington Post is not responsible for the content or privacy practices of the Linked Sites.

11. Disclaimer of Warranties

THE SERVICES ARE PROVIDED “AS IS.” WE DO NOT WARRANT THAT THE SERVICES WILL BE UNINTERRUPTED OR ERROR-FREE. THERE MAY BE DELAYS, OMISSIONS, INTERRUPTIONS, AND INACCURACIES IN THE NEWS, INFORMATION, OR OTHER MATERIALS AVAILABLE THROUGH THE SERVICES. The Washington Post DISCLAIMS TO THE MAXIMUM EXTENT PERMITTED BY LAW ANY AND ALL SUCH REPRESENTATIONS AND WARRANTIES. IF YOU RELY ON THE SERVICES AND ANY MATERIALS MADE AVAILABLE THROUGH THE SERVICES, YOU DO SO SOLELY AT YOUR OWN RISK.

WITHOUT LIMITING THE GENERALITY OF THE FOREGOING, THE WASHINGTON POST DISCLAIMS TO THE MAXIMUM EXTENT PERMITTED BY LAW ANY AND ALL (A) WARRANTIES OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE, (B) WARRANTIES AGAINST INFRINGEMENT OF ANY THIRD-PARTY INTELLECTUAL PROPERTY OR PROPRIETARY RIGHTS, (C) WARRANTIES RELATING TO THE TRANSMISSION OR DELIVERY OF THE SERVICE, (D) WARRANTIES RELATING TO THE ACCURACY, RELIABILITY, CORRECTNESS, TIMELINESS OR COMPLETENESS OF DATA MADE AVAILABLE ON THE SERVICES OR OTHERWISE BY THE WASHINGTON POST, INCLUDING ANY ADVICE, OPINION, STATEMENT, OR OTHER MATERIAL OR DATABASE DISPLAYED, UPLOADED OR DISTRIBUTED IN THE SERVICES OR AVAILABLE THROUGH THE SERVICES, AND WARRANTIES OTHERWISE RELATING TO PERFORMANCE, NONPERFORMANCE, OR OTHER ACTS OR OMISSIONS BY THE WASHINGTON POST OR ANY THIRD PARTY. FURTHER, THERE IS NO WARRANTY THAT THE SERVICES WILL MEET YOUR NEEDS OR REQUIREMENTS OR THE NEEDS OR REQUIREMENTS OF ANY OTHER PERSON.

THE WASHINGTON POST MAKES NO WARRANTIES OR REPRESENTATIONS, EXPRESS OR IMPLIED, (A) THAT THE INFORMATION PROVIDED THROUGH THE SERVICES WILL BE FREE FROM ERROR, OMISSION, INTERRUPTION, DEFECT, OR DELAY IN OPERATION, OR FROM TECHNICAL INACCURACIES OR TYPOGRAPHICAL ERRORS, (B) THAT THE SERVICES WILL BE AVAILABLE AT ANY PARTICULAR TIME OR LOCATION, (C) THAT DEFECTS OR ERRORS IN THE SERVICES WILL BE CORRECTED, OR (D) THAT THE CONTENT ON THE SERVICES IS FREE OF VIRUSES OR OTHER HARMFUL COMPONENTS. ANY INFORMATION ON THE SERVICES IS SUBJECT TO CHANGE WITHOUT NOTICE, AND THE WASHINGTON POST DISCLAIMS ALL RESPONSIBILITY FOR THE SERVICES. WE RESERVE THE RIGHT TO CORRECT ANY ERRORS OR OMISSIONS IN THE SERVICES.

12. Limitation of Liability

IN NO EVENT WILL THE WASHINGTON POST OR ITS AFFILIATES OR ANY PARTY INVOLVED IN CREATING, PRODUCING, OR DELIVERING THE SERVICES BE LIABLE FOR ANY DIRECT, INCIDENTAL, CONSEQUENTIAL, INDIRECT, SPECIAL, OR PUNITIVE DAMAGES ARISING OUT OF YOUR ACCESS, USE, MISUSE, OR INABILITY TO USE THE SERVICES OR ANY LINKED SITES, OR IN CONNECTION WITH ANY FAILURE OF PERFORMANCE, ERROR, TRANSMISSION, COMPUTER VIRUS, OR LINE OR SYSTEM FAILURE. THESE LIMITATIONS APPLY WHETHER THE ALLEGED LIABILITY IS BASED ON CONTRACT, TORT, NEGLIGENCE, STRICT LIABILITY, OR ANY OTHER BASIS, EVEN IF THE WASHINGTON POST HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. BECAUSE SOME JURISDICTIONS DO NOT ALLOW THE EXCLUSION OR LIMITATION OF INCIDENTAL OR CONSEQUENTIAL DAMAGES, THE WASHINGTON POST’S LIABILITY IN SUCH JURISDICTIONS SHALL BE LIMITED TO THE EXTENT PERMITTED BY LAW.

13. Indemnification

You agree to indemnify and hold harmless The Washington Post and its parent, subsidiaries, and affiliates, and their owners, directors, officers, managers, employees, shareholders, agents, and licensors, from and against all losses, expenses, damages and costs, including reasonable attorneys’ fees, resulting from any violation of the Terms, including the Discussion and Submission Guidelines, or the failure to fulfill any obligations relating to your account incurred by you or any other person using your account. We reserve the right to take over the exclusive defense of any claim for which we are entitled to indemnification under this Section. In such event, you shall provide us with such cooperation as is reasonably requested by us.

14. Governing Law

This Agreement shall be governed by the laws of the United States and the District of Columbia. By using the Services, you waive any claims that may arise under the laws of other states, countries, territories or jurisdictions.

15. Termination

The Washington Post may terminate this agreement for any reason at any time. The Washington Post reserves the right, in its sole discretion, to restrict, suspend, or terminate your access to and use of the Service, with or without prior notice. Otherwise applicable sections of the Terms shall survive termination. In addition to any termination rights, we reserve the right to enforce and prosecute any violations of these Terms.

16. Miscellaneous

Supplemental Terms. In connection with your use of the Services, you may be asked to consent to policies or terms and conditions in addition to these Terms. Please read these supplemental policies and terms carefully before making any use of such portions of the Services. Any supplemental terms will not vary or replace these Terms regarding any use of the Services, unless otherwise expressly stated.

No Waiver. The failure of The Washington Post to enforce any provisions of the Terms or to respond to a breach by you or other parties shall not in any way waive its rights to enforce subsequently any terms or conditions of the Terms or to act with respect to similar breaches.

No Partnership. You agree that no joint venture, partnership, employment, or agency relationship exists between you and The Washington Post as a result of these Terms or your access to and use of the Services.

Entire Agreement. Unless otherwise specified herein, the Terms constitute the entire agreement between you and The Washington Post and govern your use of the Services. If any portion of the Terms is held invalid or unenforceable, that portion shall be construed in a manner consistent with applicable law to reflect, as nearly as possible, the original intention of the parties, and the remaining portions shall remain in full force and effect.


Saudi Arabia’s crown prince must restore dignity to his country — by ending Yemen’s cruel war


Yemeni children take part in a mass funeral in the northern city of Saada, Yemen, on Aug. 13. (Stringer/AFP/Getty Images)

Saudi Arabia must face the damage from the past three-plus years of war in Yemen. The conflict has soured the kingdom’s relations with the international community, affected regional security dynamics and harmed its reputation in the Islamic world. Saudi Arabia is in a unique position to simultaneously keep Iran out of Yemen and end the war on favorable terms if it changes its role from warmaker to peacemaker. It could use its clout and leverage within Western circles to empower international institutions and mechanisms to resolve the conflict. However, the window for achieving a resolution is rapidly closing.

The U.N.-sponsored Geneva peace talks that were scheduled to open last Thursday have practically collapsed, in part because Houthi rebels who control the capital (and most of western Yemen) feared their delegation’s return home would be blocked, given Saudi Arabia’s control of Yemen’s airspace. The Saudis could provide their enemy and U.N. officials with travel support, or perhaps even a Saudi plane. Better still, Saudi Arabia could announce a cease-fire and offer peace talks in the Saudi city of Taif, where previous peace talks with Yemenis have taken place.

Saudi Arabia’s actions in Yemen were driven by national security concerns over Iranian involvement in the country. But the war effort has not provided an extra layer of security; it has instead increased the likelihood of domestic casualties and damage. Saudi defenses rely on the U.S.-made Patriot missile system, which has largely prevented Houthi missiles from causing substantial damage. Yet the inability of Saudi authorities to prevent Houthi missiles from being fired in the first place is an embarrassing reminder that the kingdom’s leadership cannot restrain its Iranian-backed opponent.

Each missile fired by Houthi forces imposes both a political and a financial burden on the kingdom. The cost of an Iranian missile supplied to the Houthis is uncertain, but it is surely a fraction of the $3 million Patriot interceptor fired to bring it down.

Unexpected costs associated with the conflict in Yemen mean Saudi Arabia has increasingly been borrowing in international markets without clearly saying what the funds are for. The kingdom has reportedly raised $11 billion in a loan from international banks.

Furthermore, the political costs associated with the loss of innocent life cannot be tabulated. Lapses in Saudi intelligence led to a strike on a bus suspected of carrying Houthi forces; instead, the bomb hit a school bus carrying children. The kingdom cannot keep an open war zone on its southern border and still expect to retain the confidence of international markets and the moral high ground.

Mistakes and risks associated with long-term conflict diminish Saudi standing internationally and increase the chances of a confrontation with traditional allies. Defense Secretary Jim Mattis recently stated, “We support our partner Saudi Arabia’s right to self-defense.” The Saudi media ran Mattis’s statement and quoted him with great enthusiasm but selectively omitted the portion that stated American support was “not unconditional” and that he urged Saudi authorities to “do everything humanly possible to avoid any innocent loss of life.”

Mattis’s remarks should serve as a reality check for Saudi Crown Prince Mohammed bin Salman. Saudi Arabia is defined and represented by its Islamic stature. We should not need to be reminded of the value of human life. Muslims around the world deserve to see the birthplace of Islam represent the ethics of Islam.

Saudi Arabia does not deserve to be compared to Syria, whose leader seemingly did not hesitate to use chemical weapons against his own people. But continuing the war in Yemen will validate voices saying that Saudi Arabia is doing in Yemen what Syrian President Bashar al-Assad, the Russians and the Iranians are doing in Syria. Even in the south of Yemen that has been “liberated,” protesters are staging a civil disobedience campaign, chanting slogans against the Saudi-led coalition, which is seen as the actual power on the ground rather than Yemen’s exiled government.

Peace talks offer Saudi Arabia a golden opportunity. Riyadh will almost certainly find international support if it enters into a cease-fire while negotiations take place. It must use its global clout, and enlist international institutions and allies, to financially pressure Tehran to stand down in Yemen. The crown prince must also accept that the Houthis, the Islah (Sunni Islamists) and the southern separatists should play a future role in the governance of Yemen. Riyadh will not get everything it wants, and it should leave Yemenis to sort out their differences with the Houthis in a national congress instead of on bloody battlefields.

The longer this cruel war lasts in Yemen, the more permanent the damage will be. The people of Yemen will be busy fighting poverty, cholera and water scarcity and rebuilding their country. The crown prince must bring an end to the violence and restore the dignity of the birthplace of Islam.


Washington Post: Breaking News, World, US, DC News & Analysis - The Washington Post
“This request is about policy, not politics,” the Ways and Means Committee chairman said. The president has said he does not plan to hand over his tax returns to Congress — and that he would fight it to the Supreme Court, according to two administration officials.
The president’s son-in-law, described in a House committee document as “Official 1,” had “significant disqualifying factors,” according to a White House whistleblower.
A Coast Guard craft patrols a waterway in front of Mar-a-Lago resort. (AP)
An entranceway to the West Palm Beach, Fla., resort. (Getty Images)
A motorcade carrying the president departs Mar-a-Lago. (AP)
The FBI is looking at why a Chinese national illegally gained access to Mar-a-Lago last weekend.
The new accounts emerged weeks before former vice president Joe Biden is expected to announce his decision about a White House bid. They reflected a feeling among some women that he was struggling to understand why his behavior might at times be inappropriate or unwelcome.
Play the latest episode of Post Reports, the premier daily podcast from The Washington Post.
The alleged multimillion-dollar bribery scheme is rare and widely shunned. But that doesn’t necessarily preclude other underhanded tactics, including attempts to sabotage students who are competing for coveted spaces at the most selective schools.
A section of walkway along the Tidal Basin next to the Jefferson Memorial is underwater. (J. Lawler Duggan for The Post)
Katherine Malone-France of the National Trust for Historic Preservation speaks at the Jefferson Memorial. (J. Lawler Duggan for The Post)
Advocates say walkways are too narrow for the 36 million annual visitors, forcing them off the paths to tread on the roots of the cherry blossom trees whose beauty they come to celebrate. Meanwhile, the entire basin is slowly sinking.
As the Consumer Product Safety Commission’s acting chairwoman awaits confirmation to continue serving beyond this year, two Democrats seek information on the role of the Trump appointee in the agency’s probe into BOB jogging strollers.
The move to ease the confirmation of President Trump’s nominees came after Senate debate exposed raw emotions delivered in highly personal terms.
Investigators found reasonable cause to suspect that the state “routinely violates the constitutional rights of prisoners” and detailed killings, sex abuse and drug use among inmates in a warning letter to state officials.
Pope Francis has selected Gregory, who helped write the first U.S. Catholic guidelines on ending the abuse scandal, to lead an archdiocese tarnished by the American church's sexual abuse crisis.
“I felt that I needed to love this child and keep her safe,” Liz Smith, a nurse in Massachusetts, said of the girl, who had been a ward of the state since she was 3 months old.
(Allie Caren, Blair Guild/The Washington Post)
How do product safety recalls work?
'Sellout, turncoat, clown.' Nats fans have a message for Bryce Harper.
How Trump and his administration talk about Puerto Rico
Introducing Chicken and Waffles cereal, plus other flavors you never asked for
Market Watch
Dow 26,181.59
Today 0.29%
S&P 2,867.63
Today 0.02%
NASDAQ 7,846.74
Today 0.23%
Last Updated:3:00 PM 04/04/2019

WhatsApp founder plans to leave after broad clashes with parent Facebook


Co-founders of WhatsApp, a messaging service whose logo is shown above, had clashed with owner Facebook over data privacy and other issues. (Dado Ruvic/Reuters)

The billionaire chief executive of WhatsApp, Jan Koum, is planning to leave the company after clashing with its parent, Facebook, over the popular messaging service’s strategy, including Facebook’s attempts to use WhatsApp’s personal data and to weaken its encryption, according to people familiar with internal discussions.

Koum, who sold WhatsApp to Facebook for more than $19 billion in 2014, also plans to step down from Facebook’s board of directors, according to these people. The date of his departure isn’t known.

It “is time for me to move on,” Koum wrote in a Facebook post after The Washington Post reported his plans to depart. He has been informing senior executives at Facebook and WhatsApp of his decision, and in recent months has been showing up less frequently to WhatsApp’s offices on Facebook’s campus in Silicon Valley, according to the people.

The independence of WhatsApp and the protection of its users’ data are core tenets that Koum and his co-founder, Brian Acton, promised to preserve when they sold their tiny start-up to Facebook. WhatsApp doubled down on that pledge by adding encryption in 2016. The clash over data took on additional significance in the wake of revelations in March that Facebook had allowed third parties to mishandle its users’ personal information.

Facebook chief executive Mark Zuckerberg replied to Koum’s post by crediting Koum with teaching him “about encryption and its ability to take power from centralized systems and put it back in people’s hands. Those values will always be at the heart of WhatsApp.”

Facebook, though, needs to prove that its investment in WhatsApp — its largest acquisition ever — was worth it.

“Part of Facebook’s success has been to digest acquisitions, successfully monetize them, and integrate them into their advertising machine,” said Daniel Ives, chief strategy officer and head of technology research for research firm GBH Insights. But WhatsApp has been more challenging because of resistance from the founders, he said. “This was a massive culture clash.”

Koum’s exit is highly unusual at Facebook. The inner circle of management, as well as the board of directors, has been fiercely loyal during the scandals that have rocked the social media giant. In addition, Koum is the only founder of a company acquired by Facebook to serve on its board. Only two other Facebook executives, Zuckerberg and Chief Operating Officer Sheryl Sandberg, are members of the board.

Facebook declined to comment on the reasons for Koum’s departure but didn’t dispute the accounts.

In his Facebook post, Koum said he would take some time off from technology to focus on other pursuits, “such as collecting rare air-cooled Porsches, working on my cars and playing ultimate frisbee.”

Acton left the company in November. He has joined a chorus of former executives critical of Facebook. Acton recently endorsed a #DeleteFacebook social media campaign that has gained force in the wake of the controversy over data privacy sparked by Cambridge Analytica, a political marketing firm tied to the Trump campaign that had inappropriately obtained the private information of 87 million Facebook users.

Though the Cambridge Analytica revelations contributed to a climate of broader frustration with Facebook among WhatsApp employees, Koum made his decision to leave before the scandal, the people said.

WhatsApp, with 1.5 billion monthly users, is the largest messaging service in the world. It is most popular in countries such as India, Egypt and Brazil, as well as in Europe, where it is used for phone calls and text messaging with friends and businesses, as well as news distribution and group chats.

Koum and Acton, former co-workers at Yahoo, founded WhatsApp in 2009. It promised private communications for 99 cents a year. By 2014, the tiny company had almost 500 million users. It caught the attention of Zuckerberg, who was looking to expand the social network overseas. After a dinner at Zuckerberg’s house, Zuckerberg made an offer for WhatsApp that turned Acton and Koum into instant billionaires.

But even in the early days, there were signs of a mismatch. WhatsApp had less than $20 million in revenue at the time of the acquisition. Facebook was making billions of dollars by selling advertisers access to its users, on whom it had collected large amounts of information.

Koum and Acton were openly disparaging of the targeted advertising model. In a WhatsApp blog post in 2012, they wrote that “no one wakes up excited to see more advertising; no one goes to sleep thinking about the ads they’ll see tomorrow.” They described online advertising as “a disruption to aesthetics, an insult to your intelligence, and the interruption of your train of thought.”

The WhatsApp co-founders were also big believers in privacy. They took pains to collect as little data as possible from their users, requiring only phone numbers, a practice that put them at odds with data-hungry Facebook. At the time of the acquisition, Koum and Acton said Facebook had assured them that WhatsApp could remain an independent service and would not share its data with Facebook.

How, and whether, WhatsApp would make money was left an open question. “WhatsApp will remain autonomous and operate independently,” the founders wrote in a blog post announcing the acquisition. “And you can still count on absolutely no ads interrupting your communication.”

Eighteen months later, the promise not to share data evaporated. Facebook pushed WhatsApp to change its terms of service to give the social network access to the phone numbers of WhatsApp users, along with analytics such as what devices and operating systems people were using.

WhatsApp executives were comfortable sharing some data with Facebook to measure who was using the service, according to the people. But they opposed using WhatsApp’s data to create a user profile that was unified across Facebook’s multiple platforms, which also include Instagram and Facebook Messenger, and that could be used for ad-targeting or for Facebook’s data-mining.

Acton and Koum acquiesced, enabling Facebook to recommend that users’ WhatsApp contacts become their Facebook friends and making it possible for Facebook to collect more data about those relationships. The changes also allowed advertisers to feed lists of phone numbers into Facebook’s advertising system, known as Custom Audience, and find new people to target with ads.

Last year, the European Commission, the European Union’s regulatory authority, fined Facebook $122 million for making “misleading” statements when the E.U. approved the WhatsApp takeover.

Conflicts soon arose over how WhatsApp would make money. Facebook scrapped the 99-cent annual charge, and Koum and Acton continued to oppose the advertising model. The service still has no ads, but WhatsApp has embarked on experiments to make money: In January, Facebook rolled out a tool, called WhatsApp Business, to allow businesses to create a profile and send messages to their customers on WhatsApp. The founders also clashed with Facebook over building a mobile payments system on WhatsApp in India.

Another point of disagreement was over WhatsApp’s encryption. In 2016, WhatsApp added end-to-end encryption, a security feature that scrambles people’s messages so that outsiders, including WhatsApp’s owners, can’t read them. Facebook executives wanted to make it easier for businesses to use its tools, and WhatsApp executives believed that doing so would require some weakening of its encryption.

Ultimately, Koum was worn down by the differences in approach, the people said. Other WhatsApp employees are demoralized and plan to leave in November, four years and a month after the Facebook acquisition, when they are allowed to exercise all their stock options under the terms of the Facebook deal, according to the people.

Acton donated $50 million of his money to Signal, a rival messaging app that is geared toward security and privacy. In a recent blog post announcing his donation and role as the executive chairman of the nonprofit Signal Foundation, Acton said his goal was to build “the most trusted communications experience on the planet.”

Julie Tate contributed to this report.


What’s driving Silicon Valley to become ‘radicalized’


Larry Gadea, the chief executive of Envoy, works at his desk in his company's San Francisco office. (Nick Otto for The Washington Post)

SAN FRANCISCO — Like many Silicon Valley start-ups, Larry Gadea’s company collects heaps of sensitive data from his customers.

Recently, he decided to do something with that data trove that was long considered unthinkable: He is getting rid of it.

The reason? Gadea fears that one day the FBI might do to him what it did to Apple in their recent legal battle: demand that he give the agency access to his encrypted data. Rather than make what he considers a Faustian bargain, he’s building a system that he hopes will avoid the situation entirely.

A guest registration tablet made by Envoy. (Nick Otto for The Washington Post)

“We have to keep as little [information] as possible so that even if the government or some other entity wanted access to it, we’d be able to say that we don’t have it,” said Gadea, founder and chief executive of Envoy. The 30-person company enables businesses to register visitors using iPads instead of handwritten visitor logs. The technology tracks who works at a firm, who visits the firm, and their contact information.
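The data-minimization idea Gadea describes can be sketched in a few lines of Python. This is a hypothetical illustration, not Envoy's actual implementation: instead of storing a visitor's email address, the service stores only a salted, keyed hash of it, enough to recognize a returning visitor, but nothing readable to hand over. The salt name and normalization rules are assumptions for the example.

```python
import hashlib
import hmac

# Assumption for illustration: one secret random salt per customer deployment.
SITE_SALT = b"per-deployment secret salt"

def visitor_token(email: str) -> str:
    # One-way, keyed hash: tokens can be compared, but the email address
    # cannot be recovered from what is stored.
    normalized = email.strip().lower().encode()
    return hmac.new(SITE_SALT, normalized, hashlib.sha256).hexdigest()

# What the server retains: tokens only, no raw contact information.
seen = set()
seen.add(visitor_token("Alice@example.com"))

# A returning visitor is still recognized despite formatting differences,
# yet the address itself never appears in storage.
assert visitor_token(" alice@example.com") in seen
assert "alice@example.com" not in repr(seen)
```

The trade-off, as the article notes, is capability: with only tokens on hand, the company can count repeat visits but cannot, for example, email a visitor later without asking for the address again.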

In Silicon Valley, there’s a new emphasis on putting up barriers to government requests for data. The Apple-FBI case and its aftermath have tech firms racing to employ a variety of tools that would place customer information beyond the reach of a government-ordered search.

The trend is a striking reversal of a long-standing article of faith in the data-hungry tech industry, where companies including Google and the latest start-ups have predicated success on the ability to hoover up as much information as possible about consumers.

Now, some large tech firms are increasingly offering services to consumers that rely far less on collecting data. The sea change is even becoming evident among early-stage companies that see holding so much data as more of a liability than an asset, given the risk that cybercriminals or government investigators might come knocking.

Start-ups that once hesitated to invest in security are now repurposing limited resources to build technical systems to shed data, even if it hinders immediate growth.

“Engineers are not inherently anti-government, but they are becoming radicalized, because they believe that the FBI, in particular, and the U.S. government, more broadly, wants to outlaw encryption,” said prominent venture capitalist Marc Andreessen in a recent interview. Andreessen’s firm, Andreessen Horowitz, is an investor in Envoy.

The government abandoned its effort to force Apple to help unlock the iPhone of one of the San Bernardino terrorists and paid professional hackers to crack the phone instead. But experts say that the issue is far from settled, and will probably be the subject of court and legislative battles.

The FBI found a way into San Bernardino shooter Syed Farook's iPhone and dropped its bid to force Apple to help crack the phone, though the legal questions remain unresolved. (Jhaan Elker/The Washington Post)

Start-ups are particularly wary, Andreessen said, of legislation proposed recently by Sens. Richard Burr (R-N.C.) and Dianne Feinstein (D-Calif.) that would compel tech companies to build technical methods to share customers’ encrypted data, at a court’s request.

“They believe there’s this window of opportunity that if we build strong encryption now, we can make it a fait accompli. But if we let five years pass, it may never happen,” Andreessen said.

In the past two years, more companies have embraced encryption, which scrambles information so that it looks like a stream of unintelligible characters to an outsider who accesses it without permission. What’s changed more recently, industry officials say, is that companies are encrypting data and throwing away the key, so that even they cannot gain access, a move that started with Apple but is spreading across the Valley.
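The "throw away the key" design can be sketched in miniature. The toy Python below is not real cryptography (a production system would use a vetted cipher such as AES through an audited library); it only illustrates the structure: the key is derived client-side from a passphrase the server never sees, so the server stores just a salt and ciphertext and has nothing usable to produce under a court order. All function names here are invented for the example.

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # Runs on the client; the passphrase (and thus the key) never reaches the server.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)

def _keystream(key: bytes, n: int) -> bytes:
    # SHA-256 in counter mode as a stand-in for a real stream cipher (e.g. AES-CTR).
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(passphrase: str, plaintext: bytes) -> tuple[bytes, bytes]:
    salt = os.urandom(16)
    ks = _keystream(derive_key(passphrase, salt), len(plaintext))
    return salt, bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt(passphrase: str, salt: bytes, ciphertext: bytes) -> bytes:
    ks = _keystream(derive_key(passphrase, salt), len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, ks))

# The server stores only (salt, ciphertext): no key, nothing readable.
salt, ct = encrypt("user passphrase", b"private message")
assert decrypt("user passphrase", salt, ct) == b"private message"
assert decrypt("wrong guess", salt, ct) != b"private message"
```

This structure also explains the trade-off discussed below: because the operator never holds the key, it cannot recover a customer's data if the passphrase is lost.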

This latter tactic is the most worrisome to law enforcement. Government officials have said repeatedly they do not want to outlaw encryption; FBI Director James B. Comey has called strong encryption a vital means of protecting the public’s personal information from hackers.

But officials insist that there must be a technical means to access that information when companies are served with warrants. Otherwise, there will be “profound consequences for public safety,” Comey told Congress in March. Terrorists and criminals are already using messaging services to which tech companies have thrown away the key, he said. Investigators say two such services, WhatsApp and Telegram, were used by terrorists in the Paris attacks last November.

“This is a Silicon Valley delusion that the government wants to outlaw encryption,” Stewart A. Baker, a former National Security Agency general counsel, said in an interview. “I grant that there is a radicalized subculture of engineers that is very prone to that delusion, but it is a delusion.”

Certainly, not every company will build such systems. Many simply can’t: their businesses rely on targeted advertising or the mining of customer data, and cutting off access would be a recipe for failure. But many start-ups that wouldn’t have considered it before the Apple-FBI fight are now doing so and discussing the accompanying trade-offs, said Bret Taylor, formerly Facebook’s chief technology officer and now chief executive of the start-up Quip.

The trade-offs can be significant: Heavy encryption risks slowing down a service. It limits a company’s ability to analyze customer behavior or introduce new features. (Encrypting email, for example, makes it harder to search.) And once customers hold the only key to their data, the company can’t give them a backup if they lose it.
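
The search trade-off in particular is easy to demonstrate in miniature. This sketch uses a homemade XOR cipher and hypothetical message data purely for illustration; it shows why a provider that stores only ciphertext can no longer run searches on its customers' behalf:

```python
import hashlib
import secrets

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy XOR stream cipher for illustration only -- not production cryptography.
    stream = b""
    counter = 0
    while len(stream) < len(plaintext):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(plaintext, stream))

key = secrets.token_bytes(32)
emails = [b"lunch on friday?", b"quarterly report attached"]
encrypted = [toy_encrypt(key, m) for m in emails]

# A plaintext search finds the message; the same query against the
# ciphertext the provider holds turns up nothing.
assert any(b"report" in m for m in emails)
assert not any(b"report" in c for c in encrypted)
```

Only the key holder can decrypt and search; the provider sees random-looking bytes, which is the feature and the limitation at once.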

Such efforts over the past few years have been described as part of an arms race between large tech companies and potential invaders, spurred largely by the growing threat of cyberattacks. To some extent, they’ve also been prompted by a newfound wariness of government after Edward Snowden’s revelations about government surveillance, as well as a growing awareness among entrepreneurs of the sheer sensitivity of the data on their services.

Apple led the pack, launching end-to-end encryption with its popular messaging app, iMessage, in 2011. In 2014, the company blocked its own access to information stored on iPhones -- data that disappears permanently after 10 failed passcode attempts. (End-to-end encryption enables only the partners trading messages to decode them. The companies providing the means to transmit them cannot.)

WhatsApp, the global messaging service owned by Facebook, announced end-to-end encryption this year, as did Viber, a messaging app that is popular in Europe. These years-long technical efforts predated the FBI case. Cloudera and Box, two large tech start-ups selling data storage and processing systems to large corporations, have built encrypted systems over the past year in which only the customer has the keys needed to unscramble data.

The Apple-FBI case, and the possibility of “backdoor” legislation mandating encryption bypasses for law enforcement, marks a new inflection point. Earlier this month, Google launched Allo, a chat app that allows users to switch on end-to-end encryption, and Amazon chief executive Jeffrey P. Bezos said he was exploring measures to encrypt data and throw away the keys on devices owned by the Seattle-based company.

Stealth Worker — a start-up funded six months ago by the prominent incubator Y Combinator — provides contract cybersecurity experts to early-stage start-ups, which often operate on a shoestring budget. Stealth Worker chief executive Ken Baylor said that in the past month he had been approached by a half-dozen companies looking for ways to build tougher encryption and other secure technical architectures. But many don’t want to talk about it, he said.

“They are afraid of a phone call from someone high up saying that they are unpatriotic,” Baylor said.

Bracket Computing, a 70-person Silicon Valley start-up, embarked on an encryption project about a month ago intended to make it easier for customers to hold the keys to their own data.

That way, “I can’t get subpoenaed the way Apple did,” Bracket chief executive Tom Gillis said. “This clears up the whole issue: If you have an issue with my customer, go talk to my customer, don’t talk to me. I’m just a tech guy, and I don’t want to be in the middle of these things.”

Gillis said that initially, customers seeking the ability to hold the keys to their data were large, sophisticated financial services companies, such as Goldman Sachs and Blackstone. Today, a broader array of companies, including media and automotive firms and small banks, are making these requests. Advances in Intel’s chips, he said, have made it possible to build these complex systems 13 times as fast as in 2010.

Building systems that cut off a company’s access to customer data is time- and resource-intensive, and these systems don’t come without risks.

Envoy CEO Gadea, an engineering prodigy who was hired by Google when he was just 18, estimates that his company’s data-wiping project will take a few months and about three engineers working full time.

Currently, when a visitor enters a building with an Envoy registration system, a message is sent alerting the appropriate employee that they have a guest. Envoy can send such messages — by text, email or other messaging services — because the customer data is stored on its servers, which are hosted remotely by Amazon Web Services, the cloud division of Amazon. The information is encrypted, but Envoy holds the keys to unscramble it. (Amazon CEO Bezos owns The Washington Post.)

Employees of Envoy in their San Francisco office. (Nick Otto for The Washington Post)

Under the new protocol, the engineering team will have to reconfigure the system so that the keys to unscramble the data are kept by the customers on the iPads used to sign people in. Envoy will no longer have the ability to access the keys. The technical challenge will be making it possible for the iPads to alert people when they have visitors, instead of having the alerts come from Envoy’s servers. The goal is to make the change unnoticeable to users, Gadea said, but it could take months to get there.

There will undoubtedly be many trade-offs, Gadea said. Not only will Envoy sacrifice the ability to send visitor notifications directly, but customer service also could become more challenging. Today, if one of Envoy’s 2,000 customers asks for help correcting a mistake in a visitor name or resetting a password, an Envoy customer service rep can lend a hand. Under the new system, Envoy’s reps could have their hands tied.

The new system could also make it harder to fix software errors because Envoy will no longer be able to push out automatic updates from its servers. And if a customer loses its passwords or keys, Envoy won’t have the ability to restore the lost data. It will be inaccessible forever.

Gadea said he is not anti-government and would sell Envoy’s services to the FBI if the agency wished to become a customer. “It’s like with your friends,” he said, “you’re always going to find one thing you don’t like about them. But you’re not going to hate a person because of one disagreement.”

And he said he understands the trade-offs.

“For a small startup trying to iterate quickly, it definitely slows things down,” Gadea said. “But in the long run, it’s a competitive advantage and it reduces risk on our company. I can sleep better at night.”

Staff writer Ellen Nakashima contributed to this report.



Current, former Pentagon leaders sound alarm on Chinese technology in 5G networks


A display for Huawei’s 5G wireless technology in Beijing last year. (Mark Schiefelbein/AP)

Current and former Pentagon leaders are warning about the risks to future military operations posed by allies in Europe and Asia using Chinese technology in their 5G wireless telecommunications networks.

In a statement Wednesday, six former officials note that the immense bandwidth and super-high speeds of the coming 5G systems — up to 100 times faster than current 4G platforms — will make them attractive for the U.S. military to share data with allies or transfer information in combat.

And they and U.S. defense officials warn that allowing Chinese firms such as Huawei to outfit these networks poses unacceptable risks of espionage and disruptive cyberattacks on military operations because of the firm’s alleged ties to the Chinese government and a 2017 Chinese law that requires companies, if directed, to cooperate in surveillance activities.

“While our concern is for future operations, the time for action is now,” said the leaders, who include retired Adm. James Stavridis and retired Gen. Philip Breedlove, the two most recent commanders of NATO and U.S. European Command; retired Adm. Samuel Locklear III, former head of U.S. Pacific Command; and a former director of national intelligence, retired Lt. Gen. James R. Clapper Jr.

Their blunt statement — the first by so many former senior commanders — was timed to the Wednesday opening of a NATO summit of foreign ministers in Washington.

“As military leaders who have commanded U.S. and allied troops around the world, we have grave concerns about a future where a Chinese-developed 5G network is widely adopted among our allies and partners,’’ they said.

Pentagon leaders also are sounding the alarm. Last week, Ellen Lord, undersecretary of defense for acquisition and sustainment, said at an Atlantic Council conference that China is engaged in a struggle with the United States for “digital supremacy” economically and militarily.

She called for an integrated U.S. government strategy in partnership with Silicon Valley and the investment community to counter China’s ambitions. She warned that if other partners use equipment from a vendor such as Huawei, “we could be overcome quickly with technical overmatch,” which could diminish battlefield advantage.

“If our allies and partners go with a Huawei solution, we need to reconsider how we share critical information with them,” Lord said.

Last week, Joint Chiefs of Staff Chairman Joseph F. Dunford Jr. forecast a “broad, fundamental” threat to national security if Huawei is permitted to build allies’ networks. “A foundational element of an alliance is the ability to share information securely,” he told the House Armed Services Committee.

Last month, acting defense secretary Patrick Shanahan pointed to a number of Chinese government initiatives that, he said, seek to enhance Chinese influence globally, often through questionable means. “With initiatives like the Digital Silk Road, Made in China 2025 and Thousand Talents Program in play, which spur companies and individuals to carry out its bidding, China aims to steal its way to a China-controlled global technological infrastructure, including a 5G network,” he told the Senate Armed Services Committee.

“Let me be perfectly clear,” he said, “the United States does not oppose competition, as long as it takes place on a fair and level playing field. However, we cannot accept the unfair and illegal actions of others who intend to tilt the playing field through predatory economics and underhanded tactics.”

There is no U.S. supplier of end-to-end 5G network components. The major U.S. telecom companies, which have pledged to exclude Huawei and another Chinese firm, ZTE, from their 5G systems, rely on European providers Ericsson and Nokia and the South Korean firm Samsung. But Huawei, the world’s largest telecom equipment provider, is making a big play for business with rural carriers in the United States and with countries in Europe and the developing world.

U.S. law bars the government and its contractors from purchasing gear made by Huawei and ZTE. Legislation is pending in Congress that would prevent U.S. companies from supplying Huawei and other Chinese telecommunications companies with critical components. The White House has had ready for months an executive order that would effectively ban Chinese companies from the U.S. telecom supply chain.

Countries such as Poland, Estonia and Germany in Europe and Indonesia, Singapore and the Philippines in Asia are weighing whether to include Huawei in their next-generation systems. The United States has mounted a campaign to persuade partners that using Huawei poses unacceptable security risks.

The former U.S. commanders said their concerns fall into three categories: espionage, military operations and human rights. Noting the 2017 Chinese law, they alleged that Huawei’s provision of radio antennas and other communications gear could provide the Chinese government with a means to capture data “at will.”

Huawei’s founder, Ren Zhengfei, insists his company has never enabled Chinese government espionage and it doesn’t plan to. But U.S. officials are skeptical that the firm would resist a government directive.

The Pentagon is weighing how it might use the future 5G networks to share intelligence or conduct military operations. Former officials say the shortcomings of current satellite communications, which are vulnerable to Chinese and Russian jamming, make 5G wireless systems a logical alternative. But if they use untrustworthy equipment, these officials say, the data could be stolen or manipulated, or operations could be disrupted.

“You’re trying to target an adversary’s capability,’’ said retired Rear Adm. Mark Montgomery, a former Pacific Command director of operations and a former policy director of the Senate Armed Services Committee. “The data needs to be accurate and near real time and reliable. A 5G network is going to be highly desirable — unless it’s built by Huawei.”

The former officials also said they are concerned that the export of China’s 5G technologies “will advance a pernicious high-tech authoritarianism.”

If Huawei is invited by foreign governments to build their new networks, Beijing could have access to the data of billions of people, they allege. China already leads the world in the deployment of facial and gait recognition in settings ranging from airports to classrooms, and it has created what is probably the world’s largest censorship apparatus to monitor the private messages of 1 billion users on the WeChat app.

In the western Chinese region of Xinjiang, authorities have relied on the sweeping collection of electronic communications to support a detention and internment program that has ensnared more than 1 million Muslim citizens.

The former officials say they fear unbridled data-gathering coupled with information gleaned from 5G networks could give Beijing “unprecedented powers of foreign influence to favor authoritarian allies . . . and punish human-rights activists the world over.”

The other former officials signing the statement are retired Adm. Timothy Keating, former head of Pacific Command, and retired Army Gen. Keith Alexander, former director of the National Security Agency.

Gerry Shih in Beijing contributed to this report.


Transcript of Zuckerberg’s appearance before House committee

Facebook chief executive Mark Zuckerberg appeared before the House Energy and Commerce Committee Wednesday for his second day of questioning on the Hill. Below is a partial transcript of the hearing.

REP. GREG WALDEN (R-ORE.): Okay. I'd ask our guests to please take their seats so we can get started. The Committee on Energy And Commerce will now come to order.

WALDEN: Before my opening statement, just as a reminder to our committee members on both sides, it's another busy day at Energy and Commerce. In addition, as you will recall, to this morning's Facebook hearing, later today, our Health Subcommittee will hold its third in the series of legislative hearings on solutions to combat the opioid crisis.

And, remember, Oversight and Investigations Subcommittee will hold a hearing where we will get an update on the restoration of Puerto Rico's electric infrastructure following last year's hurricane season.

So, just a reminder: When this hearing concludes, I think we have votes on the House floor. Our intent is to get through every — every member before that point, to be able to ask questions. But then, after the votes, we will come back into our subcommittees to do that work. As Ray Baum used to say, “The fun never stops.”

The chair now recognizes himself for five minutes for purposes of an opening statement.

Good morning. Welcome, Mr. Zuckerberg, to the Energy and Commerce Committee in the House. We've called you here today for two reasons. One is to examine the alarming reports regarding breaches of trust between your company, one of the biggest and most powerful in the world, and its users. And the second reason is to widen our lens to larger questions about the fundamental relationship tech companies have with their users.

The incident involving Cambridge Analytica and the compromised personal information of approximately 87 million American users — or mostly American users — is deeply disturbing to this committee. The American people are concerned about how Facebook protects and profits from its users' data.

In short, does Facebook keep its end of the agreement with its users? How should we, as policymakers, evaluate and respond to these events? Does Congress need to clarify whether or not consumers own or have any real power over their online data? Have edge providers grown to the point that they need federal supervision?

You and your co-founders started a company in your dorm room that's grown to one — be one of the biggest and most successful businesses in the entire world.

Through innovation and quintessentially American entrepreneurial spirit, Facebook and the tech companies that have flourished in Silicon Valley join the legacy of great American companies who built our nation, drove our economy forward, and created jobs and opportunity. And you did it all without having to ask permission from the federal government and with very little regulatory involvement.

The company you created disrupted entire industries and has become an integral part of our daily lives. Your success story is an American success story, embodying our shared values of freedom of speech, freedom of association and freedom of enterprise.

Facebook also provides jobs for thousands of Americans, including my own congressional district, with data centers in Prineville. Many of our constituents feel a genuine sense of pride and gratitude for what you've created, and you're rightly considered one of the era's greatest entrepreneurs.

This unparalleled achievement is why we look to you with a special sense of obligation and hope for deep introspection. While Facebook has certainly grown, I worry it may not have matured. I think it's time to ask whether Facebook may have moved too fast and broken too many things.

There are critical unanswered questions surrounding Facebook's business model and the entire digital ecosystem regarding online privacy and consumer protection. What exactly is Facebook? Social platform? Data company? Advertising company? A media company? A common carrier in the information age? All of the above? Or something else?

WALDEN: Users trust Facebook with a great deal of information: their name, home town, email, phone number, photos, private messages, and much, much more. But, in many instances, users are not purposefully providing Facebook with data. Facebook collects this information while users simply browse other websites, shop online or use a third-party app.

People are willing to share quite a bit about their lives online, based on the belief they can easily navigate and control privacy settings and trust that their personal information is in good hands. If a company fails to keep its promises about how personal data are being used, that breach of trust must have consequences.

Today we hope to shed light on Facebook's policies and practices surrounding third-party access to and use of user data. We also hope you can help clear up the considerable confusion that exists about how people's Facebook data are used outside of the platform.

We hope you can help Congress, but, more importantly, the American people better understand how Facebook user information has been accessed by third parties, from Cambridge Analytica and Cubeyou, to the Obama for America presidential campaign.

And we ask that you share any suggestions you have for ways policymakers can help reassure our constituents that data they believe was only shared with friends or certain groups remains private to those circles.

As policymakers, we want to be sure that consumers are adequately informed about how their online activities and information are used. These issues apply not just to Facebook, but equally to the other internet-based companies that collect information about users online.

So, Mr. Zuckerberg, your expertise in this field is without rival. So thank you for joining us today to help us learn more about these vital matters and to answer our questions.

With that, I yield now to the gentleman from New Jersey, the ranking member of the Energy and Commerce Committee, my friend, Mr. Pallone, for five minutes for purposes of an opening statement.

REP. FRANK PALLONE JR. (D-N.J.): Thank you, Mr. Chairman. And I also want to thank you Mr. Zuckerberg for being here today.

Facebook has become integral to our lives. We don't just share pictures of our families, we use it to connect for school, to organize events and to watch baseball games.

Facebook has enabled everyday people to spur national political movements. Most of us in Congress use Facebook to reach our constituents in ways that were unimaginable 10 years ago, and this is certainly a good thing.

But it also means that many of us can't give it up easily. Many businesses have their only web presence on Facebook, and, for professions like journalism, people's jobs depend on posting on the site.

And this ubiquity comes with a price; for all the good it brings, Facebook can be a weapon for those, like Russia and Cambridge Analytica, that seek to harm us and hack our democracy.

Facebook made it too easy for a single person — in this instance, Aleksandr Kogan — to get extensive personal information about 87 million people. He sold this data to Cambridge Analytical [sic], who used it to try to sway the 2016 presidential election for the Trump campaign.

And Facebook made itself a powerful tool for things like voter suppression, in part by opening its platform to app developers with little or no oversight.

But it gets worse. The fact is no one knows how many people have access to the Cambridge Analytical [sic] data, and no one knows how many other Cambridge Analyticas are still out there.

Shutting down access to data to third parties isn't enough, in my opinion. Facebook and many other companies are doing the same thing: They're using people's personal information to do highly targeted product and political advertising.

And Facebook is just the latest in a never-ending string of companies that vacuum up our data, but fail to keep it safe. And this incident demonstrates yet again that our laws are not working.

Making matters worse, Republicans here in Congress continue to block or even repeal the few privacy protections we have. In this era of nonstop data breaches, last year, Republicans eliminated existing privacy and data security protections at the FCC.

PALLONE: And their justification that those protections were not needed because the Federal Trade Commission has everything under control — well, this latest disaster shows just how wrong the Republicans are.

The FTC used every tool Republicans have been willing to give it, and those tools weren't enough. And that's why Facebook acted like so many other companies, and reacted only when it got bad press.

We all know this cycle by now. Our data is stolen. The company looks the other way. Eventually, reporters find out, publish a negative story, and the company apologizes. And Congress then holds a hearing, and then nothing happens.

By not doing its job, this Republican-controlled Congress has become complicit in this nonstop cycle of privacy by press release. And this cycle must stop, because the current system is broken.

So I was happy to hear that Mr. Zuckerberg conceded that his industry needs to be regulated, and I agree. We need comprehensive privacy and data security legislation.

We need baseline protections that stretch from Internet service providers, to data brokers, to app developers and to anyone else who makes a living off our data. We need to figure out how to make sure these companies act responsibly, even before the press finds out.

But, while securing our privacy is necessary, it's not sufficient. We need to take steps immediately to secure our democracy. We can't let what happened in 2016 happen again.

And, to do that, we need to learn how Facebook was caught so flat-footed in 2016. How was it so blind to what the Russians and others were doing on its systems? Red flags were everywhere. Why didn't anyone see them? Or were they ignored?

So today's hearing is a good start. But we also need to hold additional hearings where we hold accountable executives from other tech companies, Internet service providers, data brokers and anyone else that collects our information.

Now, Congresswoman Schakowsky from Illinois and I introduced a bill last year that would require companies to implement baseline data security standards. And I plan to work with my colleagues to draft additional legislation.

But I have to say, Mr. Chairman, it's time for this committee and this Congress to pass comprehensive legislation to prevent incidents like this in the future.

My great fear is that we have this hearing today, there's a lot of press attention — and, Mr. Zuckerberg, you know, appreciate your being here once again — but, if all we do is have a hearing and then nothing happens, then that's not accomplishing anything.

And — and I — you know, I know I sound very critical of the Republicans and their leadership on this — on these privacy issues. But I've just seen it — I've just seen it over and over again — that we have the hearings, and nothing happens. So excuse me for being so pessimistic, Mr. Chairman, but that's where I am.

I yield back.

WALDEN: I think I thank the gentleman for his opening comments.

(LAUGHTER)

With that, we now conclude with member opening statements. The chair would like to remind members that, pursuant to the committee rules, all members' opening statements will be made part of the record.

Today, we have Mr. Mark Zuckerberg, Chairman and CEO of Facebook Incorporated, here to testify before the full Energy and Commerce Committee. Mr. Zuckerberg will have the opportunity to give a five-minute opening statement, followed by a round of questioning from our members.

So thank you for taking the time to be here, and you are now recognized for five minutes.

ZUCKERBERG: Thank you.

Chairman Walden, Ranking Member Pallone and members of the committee, we face a number of important issues around privacy, security and democracy. And you will rightfully have some hard questions for me to answer.

Before I talk about the steps we're taking to address them, I want to talk for a minute about how we got there. Facebook is an idealistic and optimistic company. For most of our existence, we focused on all the good that connecting people can bring.

And, as Facebook has grown, people everywhere have gotten a powerful new tool for staying connected to the people they care about most, for making their voices heard and for building community and businesses.

Just recently, we've seen the “Me Too” movement and the March for Our Lives organized, at least in part, on Facebook. After Hurricane Harvey, people came together and raised more than $20 million for relief. And there are more than 70 million small businesses around the world that use our tools to grow and create jobs.

ZUCKERBERG: But it's clear now that we didn't do enough to prevent these tools from being used for harm, as well. And that goes for fake news, foreign interference in elections and hate speech, as well as developers and data privacy. We didn't take a broad enough view of our responsibility, and that was a big mistake.

It was my mistake, and I am sorry. I started Facebook, I run it, and, at the end of the day, I am responsible for what happens here. So, now, we have to go through every part of our relationship with people to make sure that we're taking a broad enough view of our responsibility.

It's not enough to just connect people. We have to make sure those connections are positive. It's not enough to just give people a voice. We need to make sure that voice isn't used to harm other people or spread misinformation. And it's not enough to just give people control of their information. We need to make sure that the developers that they share it with protect their information too.

Across the board, we have a responsibility to not just give people tools, but to make sure that those tools are used for good.

It's going to take some time to work through all the changes we need to make. But I am committed to getting this right, and that includes the basic responsibility of protecting people's information, which we failed to do with Cambridge Analytica.

So here are a few key things that we're doing to address this situation and make sure that this doesn't happen again.

First, we're getting to the bottom of exactly what Cambridge Analytica did, and telling everyone who may have been affected. What we know now is that Cambridge Analytica improperly obtained some information about millions of Facebook members by buying it from an app developer that people had shared it with.

This information was generally information that people share publicly on their profile pages, like their name and profile picture and the list of pages that they follow. When we first contacted Cambridge Analytica, they told us that they had deleted the data. And then, about a month ago, we heard a new report that suggested that this was not true.

So now we're working with governments in the U.S., the U.K. and around the world to do a full audit of what they've done and to make sure that they get rid of any data that they still have.

Second, to make sure that no other app developers are out there misusing data, we're now investigating every single app that had access to a large amount of people's information on Facebook in the past. And, if we find someone that improperly used data, we're going to ban them from our platform and tell everyone affected.

Third, to prevent this from ever happening again, we're making sure developers can't access as much information, going forward. The good news here is that we made some big changes to our platform in 2014 that would prevent this specific instance with Cambridge Analytica from happening again today.

But there's more to do, and you can find more of the details of the other steps we're taking in the written statement I provided.

My top priority has always been our social mission of connecting people, building community and bringing the world closer together. Advertisers and developers will never take priority over that for as long as I am running Facebook.

I started Facebook when I was in college. We've come a long way since then. We now serve more than 2 billion people around the world, and, every day, people use our services to stay connected with the people that matter to them most.

I believe deeply in what we're doing, and I know that, when we address these challenges, we'll look back and view helping people connect and giving more people a voice as a positive force in the world.

I realize the issues we're talking about today aren't just issues for Facebook and our community; they're challenges for all of us as Americans. Thank you for having me here today, and I am ready to take your questions.

WALDEN: Thank you, Mr. Zuckerberg.

I'll start out, and we'll go into the questioning phase. We'll go back and forth, as we always do. Remember, it's four minutes today, so we can get to everyone.

Mr. Zuckerberg, you've described Facebook as a company that connects people and as a company that's idealistic and optimistic. I have a few questions about what other types of companies Facebook may be.

Facebook has created its own video series, starring Tom Brady, that ran for six episodes and has over 50 million views. That's twice the number of the viewers that watched the Oscars last month. Also, Facebook's obtained exclusive broadcasting rights for 25 major league baseball games this season.

Is Facebook a media company?

ZUCKERBERG: Thank you, Mr. Chairman.

I consider us to be a technology company, because the primary thing that we do is have engineers who write code and build products and services for other people.

There are certainly other things that we do, too. We — we do pay to help produce content. We build enterprise software, although I don't consider us an enterprise software company. We build planes to help connect people, and I don't consider ourselves to be an aerospace company.

But, overall, when people ask us if we're a media company, what — what I hear is, “Do we have a responsibility for the content that people share on Facebook?” And I believe the answer to that question is yes.

WALDEN: All right, let me ask the next one. You can send money to friends on Facebook Messenger using a debit card or a PayPal account to, quote, “split meals, pay rent and more,” close quote. People can also send money via Venmo or their bank app.

Is Facebook a financial institution?

ZUCKERBERG: Mr. Chairman, I do not consider ourselves to be a financial institution, although you're right that we do provide tools for people to send money.

WALDEN: So you've mentioned several times that you started Facebook in your dorm room in 2004; 15 years, 2 billion users and several — unfortunately — breaches of trust later, Facebook's today — is Facebook today the same kind of company you started with a Harvard.edu email address?

ZUCKERBERG: Well, Mr. Chairman, I think we've evolved quite a bit as a company. When I started it, I certainly didn't think that we would be the ones building this broad of a community around the world. I thought someone would do it. I didn't think it was going to be us. So we've definitely grown.

WALDEN: And — and you've recently said that you and Facebook have not done a good job of explaining what Facebook does. And so, back in 2012 and 2013, when a lot of this scraping of user and friend data was happening, did it ever cross your mind that you should be communicating more clearly with users about how Facebook is monetizing their data?

I understand that Facebook does not sell user data, per se, in the traditional sense, but it's also just as true that Facebook's user data is probably the most valuable thing about Facebook. In fact, it may be the only truly valuable thing about Facebook.

Why wasn't explaining what Facebook does with users' data a higher priority for you as a co-founder and — and now as CEO?

ZUCKERBERG: Mr. Chairman, you're right that we don't sell any data. And I would say that we do try to explain what we do as — as time goes on. It's a — it's a broad system.

You know, every day, about 100 billion times a day, people come to one of our products, whether it's Facebook or Messenger or Instagram or WhatsApp, to put in a piece of content, whether it's a — a photo that they want to share or a message they want to send someone.

And, every time, there's a control right there about who you want to share it with. Do you want to share it publicly, to broadcast it out to everyone? Do you want to share it with your friends, a specific group of people? Do you want to message it to just one — one person or a couple of people? That's the most important thing that we do. And I think that, in the product, that's quite clear.

I do think that we can do a better job of explaining how advertising works. There is a common misperception, as you say, that is just reported — often keeps on being reported, that, for some reason, we sell data.

ZUCKERBERG: I can't be clearer on this topic: We don't sell data. That's not how advertising works, and I do think we could probably be doing a clearer job explaining that, given the misperceptions that are out there.

WALDEN: Given the situation, are — can you manage the issues that are before you? Or does Congress need to intercede? I'm going to leave that, because I'm out — I'm over my time — that and I want an issue the Vietnam Veterans of America have raised, too. And we'll get back with your staff on that about some fake pages that are up.

But I want to stay on schedule, so, with that, I'll yield to Mr. Pallone for four minutes.

PALLONE: Thank you.

I — Mr. Zuckerberg, you talk about how positive and optimistic you are, and I'm — I guess I'm sorry, because I'm not. I don't have much faith in corporate America, and I certainly don't have much faith in their GOP allies here in Congress.

I really look at everything in — that this committee does, or most of what this committee does, in terms of the right to know. In other words, they — I always fear that people, you know, that go on Facebook — they don't necessarily know what's happening or what's going on with their data.

And so, to the extent that we could pass legislation, which I think we need — and you said that we probably should have some legislation — I want that legislation to give people the right to know, to empower them, to — to, you know, provide more transparency, I guess, is the best way to put it. So I'm looking at everything through that sort of lens.

So just let me ask you three quick questions. And I'm going to ask you to answer yes or no, because of the time. Yes or no: Is Facebook limiting the amount or type of data Facebook itself collects or uses?

ZUCKERBERG: Congressman, yes. We limit a lot of the data that we collect and use.

PALLONE: But, see, I — I don't see that in the announcements you've made. Like, you've made all these announcements the last few days about the changes you're going to make. And I don't really see how that — how those announcements or changes limit the amount or type of data that Facebook collects or uses in an effective way.

But let me go to the second one. Again, this is my concern — that users currently may not know or take affirmative action to protect their own privacy. Yes or no: Is Facebook changing any user default settings to be more privacy-protective?

ZUCKERBERG: Congressman, yes. In — in response to these issues, we've changed a lot of the way that our platform works, so, that way, developers can't get access to as much information.

PALLONE: But see, again, I don't see that in — in the changes that you propose. I don't really see any way that these user default settings — that you're changing these user default settings in a way that is going to be more privacy protection — protective.

But let me go to the third one. Yes or no: Will you commit to changing all user default settings to minimize, to the greatest extent possible, the collection and user — and use of users' data? Can you make that commitment?

ZUCKERBERG: Congressman, we try to collect and — and give people the ability ...

(CROSSTALK)

PALLONE: But I'd like you to answer yes or no, if you could. Will you make the commitment to change all the user — to changing all the user default settings to minimize, to the greatest extent possible, the collection and use of users' data?

That's — I don't think that's hard for you to say yes to, unless I'm missing something.

ZUCKERBERG: Congressman, this is a complex issue that I think is — deserves more than a one-word answer.

PALLONE: Well, again, that's disappointing to me, because I think you should make that commitment. And maybe what we could do is follow up with you on this, if possible — if that's okay. We can do that follow-up?

ZUCKERBERG: Yes.

PALLONE: All right.

Now, you said yesterday that each of us owns the content that we put on Facebook and that Facebook gives some control to consumers over their content. But we know about the problems with Cambridge Analytica.

PALLONE: I know you changed your rules in 2014 and again this week, but you still allow third parties to have access to personal data. How can consumers have control over their data when Facebook doesn't have control over the data itself? That's my concern. Last question.

ZUCKERBERG: Congressman, what we allowed — what we allow with our developer platform is for people to choose to sign into other apps and bring their data with them. That's something a lot of people want to be able to do.

The reason why we built the developer platform in the first place was because we thought it would be great if more experiences that people had could be more social, so if you could have a calendar that showed your friends' birthdays; if you could have an address book that had pictures of your friends in it; if you could have a map that showed your friends' addresses on it.

In order to do that, you need to be able to sign into an app, bring some of your data and some of your friends' data. And that's what we built.

Now, since then, we have recognized that that can be used for abuse, too. So we've limited it, so now people can only bring their data when they go to an app.

But that's something that a lot of people do on a day-to-day basis — is sign into apps and websites with their — with Facebook. And that's something that we're ...

PALLONE: I still don't ...

(CROSSTALK)

WALDEN: We're going to have to move on to our next question.

PALLONE: Yeah, I know. I still think that there's not enough — people aren't empowered enough to really make those decisions in a positive way.

WALDEN: The chair now recognizes a former chairman of the committee, Mr. Barton of Texas, for four minutes.

REP. JOE BARTON (R-TEX.): Well, thank you. And thank you, Mr. Zuckerberg for being here. People need to know that you're here voluntarily. You're not here because you've been subpoenaed. So we appreciate that.

Sitting behind you, you have a gentleman who used to be counsel for the committee, Mr. Jim Barnett. And, if he's affiliated with Facebook, you've got a good one. If he's not, he's just got a great seat. I don't know ...

(LAUGHTER)

... know what it is. I'm going to read you a question that I was asked. I got this through Facebook, and I've got dozens like this.

So, my first question: “Please ask Mr. Zuckerberg, why is Facebook censoring conservative bloggers such as Diamond and Silk? Facebook called them unsafe to the community. That is ludicrous. They hold conservative views. That isn't unsafe.” What's your response to ...

ZUCKERBERG: Congressman, in that specific case, our team made an enforcement error. And we have already gotten in touch with them to reverse it.

BARTON: Well, Facebook does tremendous good. When — when I met you in my office, eight years ago — you don't remember that. But I've got a picture of you when you had curly hair and Facebook had 500 million users. Now, it's got over 2 billion. That's a success story in — in anybody's book.

It's such an integral part of, certainly, young Americans' lives that you need to work with Congress and the community to ensure that it is a neutral, safe and, to the largest extent possible, private platform. Do you agree with that?

ZUCKERBERG: Congressman, I do agree that we should work to give people the fullest free expression that is possible. That's what — when I talk about giving people a voice, that's what I care about.

BARTON: Okay.

Let's talk about children. Children can get a Facebook account of their own, I believe, starting at age 13. Is that not correct?

ZUCKERBERG: Congressman, that's correct.

BARTON: Okay. Is there any reason that we couldn't have just a no-data-sharing policy, period, until you're 18? Just — if you're a child with your own Facebook account, until you reach the age of 18, you know, it's — it's — you know, you can't share anything.

It's — it's their data, their picture — it doesn't — it doesn't go anywhere. Nobody gets to scrape it; nobody gets to access it. It's absolutely, totally private. Well, it's — for children. What's wrong with that?

ZUCKERBERG: Congressman, we have a number of measures in place to protect minors specifically. We make it so that adults can't contact minors who they — they aren't already friends with. We make it so that certain content that may be inappropriate for minors, we don't show.

The reality that we see is that teens often do want to share their opinions publicly, and that's a service that ...

BARTON: Will we let them opt in to do that?

ZUCKERBERG: Yes, we do.

BARTON: But don't — you know, unless they specifically allow it, then don't allow it. That's my point.

ZUCKERBERG: Congressman, every time that someone chooses to share something on Facebook — you go to the app; right there, it says, “Who do you want to share with?” When you sign up for a Facebook account, it starts off sharing with just your friends.

If you want to share publicly, you have to specifically go and change that setting to be sharing publicly. Every time ...

BARTON: I'm — I'm about out of time. I — I actually use Facebook, and, you know, I know, if you take the time, you can go to your privacy and click on that. You can go to your settings and click on that.

You can pretty well set up your Facebook account to — to be almost totally private. But you have to really work at it. And my time's expired. Hopefully we can do some questions in writing as a follow-up.

Thanks, Mr. Chairman.

WALDEN: Absolutely. The chair now recognizes the gentleman from Illinois, Mr. Rush, for four minutes for questions.

REP. BOBBY L. RUSH (D-ILL.): Thank you, Mr. Chairman. Mr. Zuckerberg, welcome.

In the 1960s, our government, acting through the FBI and local police, maliciously tricked individuals and organizations into participating in something called COINTELPRO, which was a counterintelligence program where they tracked and shared information amongst civil rights activists, their political, social, city, even religious affiliations. And I personally was a victim of COINTELPRO.

Your organization, your methodology, in my opinion, is similar. You're truncating the basic rights of the American promise of life, liberty and the pursuit of happiness by the wholesale invasion and manipulation of their right to privacy.

Mr. Zuckerberg, what is the difference between Facebook's methodology and the methodology of the American political pariah, J. Edgar Hoover?

ZUCKERBERG: Congressman, this is an important question because I think people often ask what the difference is between surveillance and what we do. And I think that the difference is extremely clear, which is that, on Facebook, you have control over your information.

The content that you share, you put there. You can take it down at any time. The information that we collect, you can choose to have us not collect. You can delete any of it, and, of course, you can leave Facebook if you want.

I know of no surveillance organization that gives people the option to delete the data that they have, or even know what — what they're collecting.

RUSH: Mr. Zuckerberg, you should be commended that Facebook has grown so big, so fast. It is no longer the company that you started in your dorm room. Instead, it's one of — great American success stories.

That much influence comes with enormous social responsibility, on which you have failed to act and to protect and to consider. Shouldn't Facebook, by default, protect users' information? Why is the onus on the user to opt in to privacy and security settings?

ZUCKERBERG: Congressman, as I've said, every time that a person chooses to share something on Facebook, they're proactively going to the service and choosing that they want to share a photo, write a message to someone.

And, every time, there is a control right there — not buried in settings somewhere, but right there, when they're — when they're posting ...

RUSH: All right.

ZUCKERBERG: ... about who they want to share it with.

RUSH: Mr. Zuckerberg, I only have a few more seconds. In November 2017, (inaudible) reported that Facebook was — still allowed housing and work advertisements to systematically exclude advertisements to specific racial groups, an explicitly prohibited practice.

This is just one example where Facebook has allowed race — so race — race to improperly play a role. What has Facebook done, and what are you doing, to ensure that you are — that your targeted advertisements and other components of your platform are in compliance with federal laws such as the Civil Rights Act of 1968?

ZUCKERBERG: Congressman, since we learned about that, we've removed the option for advertisers to exclude ethnic groups from targeting.

RUSH: When did you do that?

WALDEN: The gentleman's time has expired.

We need to go now to the gentleman from Michigan, Mr. Upton, for four minutes.

REP. FRED UPTON (R-MICH.): Thank you, Mr. Chairman, and welcome to the committee.

A number of times in the last day or two, you've indicated that, in fact, you're now open to some type of regulation. And we know, of course, that you're the dominant social media platform without any true competitor, in all frankness. And you have hundreds, if not thousands, of folks that are — would be required to help navigate any type of regulatory environment.

Some would argue that a more regulatory environment might ultimately stifle new platforms and innovators some might describe as desperately needed competition; i.e., regulatory complexity helps protect those folks like you. It could create a harmful barrier to entry for some start-ups, particularly ones that might want to compete with you.

So should we policymakers up here be more focused on the needs of start-ups, over large incumbents? And what kind of policy regulation — regulatory environment would you want, instead of managing, maybe, a Fortune 500 company, if you were launching a start-up to — taking on the big guy?

ZUCKERBERG: Congressman, thank you, and let me say a couple of things on this. First, to your point about competition, the average American uses about eight different apps to communicate and stay connected to people.

So there's a lot of competition that we feel every day. And — and that — that's — that's an important force that — that we — that we definitely feel when running the company.

Second, on your point about regulation, the Internet is growing in importance around the world in people's lives, and I think that it is inevitable that there will need to be some regulation.

So my position is not that there should be no regulation. But I also think that you have to be careful about what regulation you put in place for a lot of the reasons that you're saying.

I think, a lot of times, regulation, by definition, puts in place rules that a company that is larger, that has resources like ours, can easily comply with, but that might be more difficult for a smaller start-up to — to comply with.

ZUCKERBERG: So I think that those are all things that need to be thought through very carefully when — when thinking through what — what rules we want to put in place.

UPTON: And, to follow up a question with — that Mr. Barton asked about Silk and Diamond — I don't know whether you know about this particular case — I have a former state rep who's running for state senate. He's the former Michigan Lottery commissioner, so he's a guy of — of fairly good political prominence.

He is a — he announced for state senate just in the last week, and he had what I thought was a rather positive announcement. It's — and I'll read to you precisely what it was.

“I'm proud to announce my candidacy for state senate. Lansing needs conservative west Michigan values, and, as our next state senator, I will work to strengthen our economy, limit government, lower our auto insurance rates, balance the budget, stop sanctuary cities, pay down government debt, be a pro-life, pro-2nd-Amendment lawmaker.”

And it was rejected. And the response from you all was it wasn't approved because it doesn't follow our advertising policies. We don't allow ads that contain shocking, disrespectful or sensational content, including ads that depict violence or threats of violence. I'm not sure where the threat was, based on what he tried to post.

ZUCKERBERG: Congressman, I'm not sure either. I'm not familiar with that specific case. It's quite possible that we made a mistake, and we'll follow up afterward to — on that.

UPTON: Okay.

ZUCKERBERG: Overall — yeah, we have — by the end of this year, we'll have about 20,000 people at the company who work on security and content-review-related issues.

But there's a lot of content flowing through the systems and a lot of reports, and, unfortunately, we don't always get these things right when people report it to us.

UPTON: Okay. Thank you.

WALDEN: Gentleman's time's expired.

Chair recognizes the gentlelady from California, Ms. Eshoo, for four minutes.

REP. ANNA G. ESHOO (D-CALIF.): Thank you, Mr. Chairman. Good morning, Mr. Zuckerberg.

First, I believe that our democratic institutions are undergoing a stress test in our country. And I believe that American companies owe something to America.

I think the damage done to our democracy, relative to Facebook and its platform being weaponized, are incalculable. Enabling the cynical manipulation of American citizens for the purpose of influencing an election is deeply offensive, and it's very dangerous. Putting our private information on offer without concern for possible misuses, I think, is simply irresponsible.

I invited my constituents, going into the weekend, to participate in this hearing today by submitting what they want to ask you. And so my questions are theirs.

And, Mr. Chairman, I'd like unanimous consent to place all of their questions in the record.

WALDEN: Without objection.

ESHOO: So these are a series of just yes-no questions.

Do you think you have a moral responsibility to run a platform that protects our democracy? Yes or no.

ZUCKERBERG: Congresswoman, yes.

ESHOO: Have users of Facebook who are caught up in the Cambridge Analytica debacle been notified?

ZUCKERBERG: Yes. We are starting to notify people this week. We started Monday, I believe.

ESHOO: Will Facebook offer to all of its users a blanket opt-in to share their privacy data with any third-party users?

ZUCKERBERG: Congresswoman, yes. That's how our platform works. You have to opt in to sign in to any app before you use it.

ESHOO: Well, let — let me just add that it is a minefield in order to do that. And you have to make it transparent, clear, in pedestrian language, just once, “This is what we will do with your data. Do you want this to happen, or not?”

So I — I think that this is being blurred. I — I think you know what I mean by it. Are you aware of other third-party information mishandlings that have not been disclosed?

ZUCKERBERG: Congresswoman, no, although we are currently going through the process of investigating every ...

(CROSSTALK)

ESHOO: So you're not sure?

ZUCKERBERG: ... that had access to a large amount of data.

ESHOO: What does that mean?

ZUCKERBERG: It means that we're going to look into every app that had a large amount of access to data in the past, before we lock down the platform. I ...

ESHOO: So you're not aware.

(CROSSTALK)

ZUCKERBERG: ... because there are tens of thousands of apps, we will find some ...

(CROSSTALK)

ESHOO: All right. I — I only have four minutes.

ZUCKERBERG: ... and, when we find them ...

ESHOO: Was your data included in the data sold to the malicious third parties? Your personal data?

ZUCKERBERG: Yes.

ESHOO: It was.

Are you willing to change your business model in the interest of protecting individual privacy?

ZUCKERBERG: Congresswoman, we are — have made and are continuing to make changes to reduce the amount of ...

ESHOO: No, are you willing to change your business model in the interest of protecting individual privacy?

ZUCKERBERG: Congresswoman, I'm not sure what that means.

ESHOO: Well, I'll follow up with you on it.

When did Facebook learn that Cambridge Analytica's research project was actually for targeted psychographic political campaign work?

ZUCKERBERG: Congresswoman, it might be useful to clarify what actually happened here. A developer does research ...

(CROSSTALK)

ESHOO: Well, no. I — I don't have time for a long answer, though. When did Facebook learn that? And, when you learned it, did you contact their CEO immediately? And, if not, why not?

ZUCKERBERG: Congresswoman, yes. When we learned in 2015 that a Cambridge University researcher associated with the academic institution that built an app that people chose to share their data with ...

ESHOO: We know what happened with them. But I'm asking you.

ZUCKERBERG: Yes. I'm answering your question.

ESHOO: Yes. All right.

ZUCKERBERG: When — when we learned about that, we ...

ESHOO: So, in 2015, you learned about it?

ZUCKERBERG: Yes.

ESHOO: And you spoke to their CEO immediately?

ZUCKERBERG: We shut down the app.

ESHOO: Did you speak to their CEO immediately?

ZUCKERBERG: We got in touch with them, and we asked them to — to — we commanded that they delete any of the data that they had, and their chief data officer told us that they had.

WALDEN: The gentlelady's time is expired.

ESHOO: Thank you.

WALDEN: Chair now recognizes the gentleman from Illinois, Mr. Shimkus, for four minutes.

REP. JOHN SHIMKUS (R-ILL.): Thank you, Mr. Chairman. Thank you for being here, Mr. Zuckerberg.

Two things: First of all, I want to thank Facebook. You streamlined our Congressional Baseball Game last year. We've got the managers here, and I was told that, because of that, we raised an additional $100,000 for D.C. literacy and feeding kids and stuff.

So that's a — the other thing is, I — I usually put my stuff up on the TV. I don't want to do it very much, because my dad — and he'd be mad if he went international, like you are — and he's been on Facebook for a long time. He's 88. It's been good for connecting with kids and grandkids.

I just got my mother involved on an iPad and — because she can't handle a keyboard. And so — and I did this last week. So the — in this world — activity — I still think there is a positive benefit for my parents to be engaged on this platform.

So — but there's issues, as being raised today. And so I'm going to go into a couple of those. Facebook made — developed access to user and friend data back in — your main update was in 2014. So the question is, what triggered that update?

ZUCKERBERG: Congressman, this is — this is an important question to clarify.

So, in 2007, we launched the platform in order to make it so that people could sign in to other apps, bring some of their information and some of their friends' information, to have social experiences.

This created a lot of innovative experiences — new games, companies like Zynga. There were companies that you're — that you're familiar with, like Netflix and Spotify — had integrations with this that allowed social experiences in their apps.

But, unfortunately, there were also a number of apps that used this for abuse, to collect people's data ...

SHIMKUS: So, if I can interrupt, it's just — you identified that there was possibly social scraping going on?

ZUCKERBERG: Yeah, there was abuse. And that's why, in 2014, we took the step of fundamentally changing how the platform works. So, now, when you sign into an app, you can bring your information, and, if a friend has also signed into the app, then we'll — then the app can know that you're friends, so you can have a social experience in that app.

But, when you sign into an app, it now no longer brings information from other people.

SHIMKUS: Yeah. Let me go to your announcement of audits. Who's going to conduct the audit? We're talking about — are there other Cambridge Analytics [sic] out there?

ZUCKERBERG: Yes, Congressman. Good question. So we're going to start by doing an investigation, internally, of every single app that had access to a large amount of information, before we lock down the platform.

If we detect any suspicious activity at all, we are working with third-party auditors — I imagine there will have to be a number of them, because there are a lot of apps — and they will conduct the audit for us.

SHIMKUS: Yeah, I think we would hope that you would bring in a third party to help us ...

ZUCKERBERG: Yes.

SHIMKUS: ... clarify and have more confidence.

The last question I have is, in yesterday's hearing, you talked a — a little about Facebook tracking in different scenarios, including logged-off users. Can you please clarify as — how that works? And how does tracking work across different devices?

ZUCKERBERG: Yes, Congressman. Thank you for giving me the opportunity to clarify that.

So one — one of the questions is — is, what information do we track, and why, about people who are not signed into Facebook. We track certain information for security reasons and for ads reasons.

For security, it's to make sure that people who are not signed into Facebook can't scrape people's public information. You can — even when you're not signed in, you can look up the information that people have chosen to make public on their page, because they wanted to share it with everyone. So there's no reason why you should have to be logged in.

But, nonetheless, we don't want someone to be able to go through and download every single public piece of information. Even if someone chose to make it public, that doesn't mean that it's good to allow someone to aggregate it. So, even if someone isn't logged in, we track certain information, like how many pages they're accessing, as a security measure.

The second thing that we do is we provide an ad network that third-party websites and apps can run in order to help them make money. And those ads — you know, similar to what Google does and what the rest of the industry does — it's not limited to people who are just on Facebook.

So, for the purposes of that, we may also collect information to make it so that those ads are more relevant and work better on those websites. There's a control that — for that second class of information around ad targeting — anyone can turn off, has complete control over it.

For obvious reasons, we do not allow people to turn off the — the measurement that we do around security.

WALDEN: The gentleman's time has expired.

We now turn to the gentleman from New York, Mr. Engel, for four minutes.

REP. ELIOT L. ENGEL (D-N.Y.): Thank you, Mr. Chairman.

Mr. Zuckerberg, you have roots in my district, the 16th congressional district of New York. I know that you attended Ardsley High School and — and grew up in Westchester County.

As you know, Westchester has a lot to offer, and I hope that you might commit to returning to Westchester County, perhaps to do a forum on — on this and some other things. I hope you would consider that. We'll — we'll be in touch — in touch with you. But I know that Ardsley High School's very proud of you.

You mentioned yesterday that Facebook was deceived by Aleksandr Kogan when he sold user information to Cambridge Analytica. Does Facebook, therefore, plan to sue Aleksandr Kogan, Cambridge University or Cambridge Analytica, perhaps, for unauthorized access to computer networks, exceeding access to computer networks or breach of contract? And why or why — why not?

ZUCKERBERG: Congressman, it's something that we're looking into. We already took action by banning him from the platform, and we're going to be doing a full audit to make sure that he gets rid of all the data that — that he — that he has, as well.

To your point about Cambridge University, what we've found now is that there's a whole program associated with Cambridge University where a number of researchers, not just Aleksandr Kogan — although, to our current knowledge, he's the only one who's sold the data to Cambridge Analytica — there were a number of other researchers who were building similar apps.

So we do need to understand whether there was something bad going on at Cambridge University overall that will require a stronger action from us.

ENGEL: You mentioned before, in your remarks, hate speech. We've seen the scale and reach of extremism balloon in the last decade, partially because of the expansion of social platforms.

Whether it's a white supremacist rally in Charlottesville that turned violent, or it's ethnic cleansing in Burma that resulted in the second-largest refugee crisis in the world, are you aware of any foreign or domestic terrorist organizations, hate groups, criminal networks or other extremist networks that have scraped Facebook user data?

And, if they have, and if they do it in the future, how would you go about getting it back or deleting it?

ZUCKERBERG: Congressman, we're not aware of any specific groups like that, that have — that have engaged in this. We are, as I've said, conducting a full investigation of any apps that had access to a large amount of data. And, if we find anything suspicious, we'll tell everyone affected.

We do not allow hate groups on Facebook, overall. So, if — if there's a group that — their primary purpose or — or a large part of what they do is spreading hate, we will ban them from the platform, overall.

ENGEL: So do you adjust your — your algorithms to prevent individuals interested in violence or nefarious activities from being connected with other like-minded individuals?

ZUCKERBERG: Sorry. Can you repeat that?

ENGEL: Do you adjust your algorithms to prevent individuals interested in violence or bad activities from being connected with other like-minded individuals?

ZUCKERBERG: Congressman, yes. That's certainly an important thing that — that we need to do.

ENGEL: Okay. And, finally, let me say this. Many of us are very angry about Russian influence in the — in the 2016 presidential elections and Russian influence over our presidential elections.

Does Facebook have the ability to detect when a foreign entity is attempting to buy a political ad? And is that process automated? Do you have procedures in place to inform key government players when a foreign entity is attempting to buy a political ad or when it might be taking other steps to interfere in an election?

ZUCKERBERG: Congressman, yes. This is an extremely important area. After we were slow to identify the Russian information operations in 2016, this has become a top priority for our company — to prevent that from ever happening again, especially this year, in 2018, which is such an important election year with the U.S. midterms, but also major elections in India, Brazil, Mexico, Hungary, Pakistan and a number of other places.

So we're doing a number of things that — that I'm — that I'm happy to talk about, or follow up with afterward, around deploying new A.I. tools that can proactively catch fake accounts that Russia or others might create to spread misinformation.

And one thing that I'll — that I'll end on here, just because I — I know we're — we're running low on time, is, since the 2016 election, there have been a number of significant elections, including the French presidential election, the German election and, last year, the U.S. Senate Alabama special election.

ZUCKERBERG: And the A.I. tools that we deployed in those elections were able to proactively take down tens of thousands of fake accounts that may have been trying to do the activity that you're — that you're talking about. So our tools are getting better.

For as long as Russia has people who are employed, who are trying to perpetrate this kind of interference, it will be hard for — for us to guarantee that we're going to fully stop everything.

But it's an arms race, and I think that we're making ground and are — are doing better and better and are confident about how we're going to be able to do ...

(CROSSTALK)

WALDEN: Gentleman's time has expired.

ENGEL: Thank you.

WALDEN: Chair recognizes the chairman of the Health Subcommittee, Mr. — Dr. Burgess of Texas, for four minutes.

REP. MICHAEL C. BURGESS (R-TEX.): Thank you, Mr. Chairman, and thanks to our witness for — for being here today.

Mr. Chairman, I have a number of articles that I ask unanimous consent to insert into the record. I know I won't have time to get to all of my questions.

WALDEN: Without objection. And we put the slide up you requested.

BURGESS: And so I'm going to be submitting some questions for the record that are referencing these articles. One is “Friended: How the Obama Campaign Connected with Young Voters,” by Michael Scherer; “We Already Know How to Protect Ourselves from Facebook,” and I hope I get this name right — Zeynep Tufekci; and “It's Time to Break Up Facebook,” by Eric Wilson, who, in the interest of full disclosure ...

WALDEN: Without objection.

BURGESS: ... was a former staffer. And I will be referencing those articles in — in some written questions.

I consulted my technology guru, Scott Adams, in the form of Dilbert, going back 21 years ago. And, when you took the shrink-wrap off of a piece of software that you bought, you were automatically agreeing to be bound by the terms and conditions.

So we've gone a long way from taking the shrimp wrap — shrink wrap off of a — off of an app. But I don't know that things have changed so much.

And, I guess, does Facebook have a position — a — a position that you recommend for elements of a company's terms and conditions that you encourage consumers to look at before they click on the acceptance?

ZUCKERBERG: Congressman, yes.

I think that it's really important for the service that people understand what they are doing and signing up for and how the service works. We have laid out all of what we do in the terms of service, because that's what is legally required of us.

BURGESS: Let me just ask you, because we're going to run short on time, do you have — have you laid out for people what it — would be indicative of a good actor, versus a less-than-good actor, in someone who's developed a — one of these applications?

ZUCKERBERG: Congressman, yes.

We have a developer terms of service, which is separate from the normal terms of service for — for individuals using the service.

BURGESS: Is the average consumer able to determine what elements would indicate poor or weak consumer protections, just by their evaluation of the terms and conditions? Do you think that's possible?

ZUCKERBERG: Congressman, I'm not sure what you mean by that.

BURGESS: Well, can you — can someone — can the average person — the average layperson look at the terms and conditions and make the evaluation, “Is this a strong enough protection for me to enter into this arrangement?”

Look, I'm as bad as anyone else. I see an app, I want it, I download it, I breeze through the stuff. Just take me to the — to the good stuff in the app. But, if a consumer wanted to know, could they know?

ZUCKERBERG: Congressman, I think you're raising an important point, which is that I think, if someone wanted to know, they could. But I think that a lot of people probably just accept terms of service without taking the time to read through it.

I view our responsibility not as just legally complying with laying it out and getting that consent, but actually trying to make sure that people understand what's happening throughout the product.

That's why, every single time that you share something on Facebook or one of our services, right there is a control in line, where you control who — who you want to share with, because I don't just think that this is about a terms of service. It's contextual.

You — you want to present people with the information about what — what they might be doing and give them the relevant controls in line, at the time that they're making those decisions, not just have it be in the background sometime, or up front — make a one-time decision.

BURGESS: Yeah, let me move onto something else.

Mr. Pallone brought up the issue of — he wanted to see more regulation. We actually passed a bill through this committee last Congress dealing with data breach notification — not so much for Facebook, but for the credit card breaches — a good bill.

Many of the friends on the other side of the dais voted against it. But it was Ms. Blackburn's bill, and I think it's one we should consider again, in light of what is going on here.

But you also signed a consent decree back in 2011. And, you know, when I read through that consent decree, it's — it's pretty explicit. And there is a significant fine of $40,000 per violation, per day. And, if you've got 2 billion users, you can see how those fines would mount up pretty quickly.

So, in the course of your audit, are you — are you extrapolating data for the people at the Federal Trade Commission for that — the terms and conditions of the consent decree?

WALDEN: It's time.

ZUCKERBERG: That is — I'm not sure what you mean by extrapolating data.

BURGESS: Well, you're — you've said — you've referenced there are audits that are ongoing. Are you making that information from those audits available to our friends at the — at the agency, at the Federal Trade Commission?

ZUCKERBERG: Congressman, as you know, the FTC is investigating this. And we are certainly going to be complying with them and working with them on that investigation.

WALDEN: Gentleman's time has expired.

Chair recognizes the gentleman from Texas, Mr. Green, for four minutes.

REP. GENE GREEN (D-TEX.): Thank you, Mr. Chairman, and welcome to our committee.

I want to follow up on what my — my friend from North Texas talked about on — on his cartoon. Next month, the General Data Protection Regulation — the GDPR — goes into effect in the European Union.

The GDPR is pretty prescription on — prescriptive on how companies treat consumer data. And it makes it clear that consumers need to be in control of their own data.

Mr. Zuckerberg, Facebook has committed to abiding to these consumer protections in Europe, and you face large penalties if they don't. In recent days, you've said that Facebook intends to make the same settings available to users everywhere, not only in Europe.

Did I understand correctly that Facebook would not only make the same settings available, but that it will make the same protections available to Americans that they will the Europeans?

ZUCKERBERG: Yes, Congressman. All the same controls will be available around the world.

GREEN: Okay. And you commit today that Facebook will extend the same protections to Americans that European users — users will receive under the GDPR?

ZUCKERBERG: Yes, Congressman. We believe that everyone around the world deserves good privacy controls. We've had a lot of these controls in place for years. The GDPR requires us to do a few more things, and we're going to extend that to the world.

GREEN: There are many requirements in the GDPR, so I'm just going to focus on a few of them.

The GDPR requires that the company's request for user consent — to be requested in a clear and concise way, using language that is understandable, and be clearly distinguishable from other pieces of information, including terms and conditions.

How will that requirement be implemented in the United States?

ZUCKERBERG: Congressman, we're going to put, at the top of everyone's app when they sign in, a tool that walks people through the settings and gives people the choices and — and asks them to make decisions on how they want their settings set.

GREEN: One of the GDPR's requirements is data portability. Users must be able to — permitted to request a full copy of their information and be able to share that information with any companies that they want to.

I know Facebook allows users in the U.S. to download their Facebook data. Does Facebook plan to use the currently existing ability of users to download their Facebook data as the means to comply with the GDPR's data portability requirement?

ZUCKERBERG: Congressman, I think we may be updating it a little bit. But, as you say, we've had the ability to download your information for years now. And people have the ability to see everything that — that they have in Facebook, to take that out, delete their account and move their data anywhere that they want.

GREEN: Does that download file include all the information Facebook has collected about any given individual?

In other words, if I download my Facebook information, is there other information accessible to you within Facebook that I wouldn't see on that document, such as browsing history or other inferences that Facebook has drawn from users for advertising purposes?

ZUCKERBERG: Congressman, I believe that all of your information is in that — that file.

GREEN: GDPR also gives users the right to object to the processing of their personal data for marketing purposes, which, according to Facebook's website, includes custom micro-target audiences for advertising.

Will the same right be object — to object be available to Facebook users in the United States? And how will that be implemented?

ZUCKERBERG: Congressman, I'm not sure how we're going to implement that yet. Let me follow up with you on that.

GREEN: Okay. Thank you, Mr. Chairman.

And again, is the small — Facebook conducted, a couple years ago, an effort in our district in Houston for our small businesses. And it was one of the most successful outreach I've seen. So I appreciate that outreach to helping small businesses use Facebook to market their products.

Thank you, Mr. Chairman.

WALDEN: Thank the gentleman.

The chair now recognizes the gentlelady from Tennessee, Ms. Blackburn, for four minutes.

REP. MARSHA BLACKBURN (R-TENN.): Thank you, Mr. Chairman.

Mr. Zuckerberg, I tell you, I think your cozy community, as Dr. Mark Jameson recently said, is beginning to look a whole lot like “The Truman Show,” where people's identities and relationships are made available to people that they don't know. And then that data is crunched and it is used and they are fully unaware of this.

So I've got to ask you, I think what we're getting to here is, who owns the virtual you? Who owns your presence online?

And I'd like for you to comment. Who do you think owns an individual's presence online? Who owns their virtual you? Is it you or is it them?

ZUCKERBERG: Congresswoman, I believe that everyone owns their own content online. And that's — the first line of our terms of service, if you read it, says that.

BLACKBURN: And where does privacy rank as a corporate value for Facebook?

ZUCKERBERG: Congresswoman, giving people control of their information and how they want to set their privacy is foundational to the whole service. It's not just a — kind of an add-on feature, something we have to ...

BLACKBURN: Okay.

ZUCKERBERG: ... comply with.

BLACKBURN: Well ...

ZUCKERBERG: The reality is, if you have a photo — if you just think about this in your day-to-day life ...

BLACKBURN: No, I can't let you filibuster right now.

A constituent of mine who's a benefits manager brought up a great question in a meeting at her company last week. And she said, you know, health care, you've got HIPAA, you've got Gramm-Leach-Bliley, you've got the Fair Credit Reporting Act. These are all compliance documents for privacy for other sectors of the industry. She was stunned, stunned, that there are no privacy documents that apply to — to you all.

And we've heard people say that — you know, and you've said you're considering, maybe you need more regulation. What we think is, we need for you to look at new legislation. And you're hearing there'll be more bills brought out in the next few weeks. But we have had a bill. The BROWSER Act, and I'm certain that you're familiar with this, is bipartisan. And I thank Mr. Lipinski and Mr. Lance and Mr. Flores for their good work on this legislation.

We've had it for over a year and certainly we've been working on this issue for about four years. And what this would do is have one regulator, one set of rules for the entire ecosystem.

And will you commit to working with us to pass privacy legislation, to pass the BROWSER Act? Will you commit to doing that?

ZUCKERBERG: Congresswoman, I'm not directly familiar with the details of what you just said. But I certainly think that regulation in this area ...

BLACKBURN: Okay, let's get — let's get familiar with the details.

As you have heard, we need some rules and regulations. This is only 13 pages. The BROWSER Act is 13 pages, so you can easily become familiar with it. And we would appreciate your help.

And I've got to tell you, as Mr. Green just said, as you look at the E.U. privacy policies, you're already doing much of that, if you're doing everything you claim. Because you will have to allow consumers to control their data, to change, to erase it.

You have to give consumers opt-in so that mothers know — my constituents in Tennessee want to know that they have a right to privacy. And we would hope that that's important to you all.

I want to move on and ask you something else. And please get back to me once you've reviewed the BROWSER Act. I would appreciate hearing from you.

We've done one hearing on the algorithms. I chair Communications and Technology Subcommittee here. We're getting ready to do a second one on the algorithms. We're going to do one next week on prioritization.

So I'd like to ask you, do you subjectively manipulate your algorithms to prioritize or censor speech?

ZUCKERBERG: Congresswoman, we don't think about what we're doing as censoring speech.

I think that there are — there are types of content like terrorism that I think that we all agree we do not want to have on our service. So we build systems that can identify those and can remove that content, and we're very proud of that work.

BLACKBURN: Let me tell you something right now: I — Diamond and Silk is not terrorism.

WALDEN: Gentlelady's time's expired.

Chair recognizes gentlelady from Colorado, Ms. DeGette, for four minutes.

REP. DIANA DEGETTE (D-COLO.): Thank you very much, Mr. Chairman.

Mr. Zuckerberg, we appreciate your contrition. And we appreciate your commitment to resolving these past problems.

From my perspective, though, and my colleagues on both sides of the aisle in this committee, we're interested in looking forward to preventing this kind of activity; not just with Facebook but with others in your industry.

And as has been noted by many people already, we've been relying on self-regulation in your industry for the most part. We're trying to explore what we can do to prevent further breaches.

So I'm going to ask you a whole series of fairly quick questions. They should only require yes-or-no answers.

Mr. Zuckerberg, at the end of 2017, Facebook had a total shareholder equity of just over $74 billion. Is that correct?

ZUCKERBERG: Sorry, Congresswoman, I'm not familiar with ...

DEGETTE: At the end of 2017, Facebook had a total shareholder equity of over $74 billion, correct?

ZUCKERBERG: Of over that?

DEGETTE: That's correct. You're the CEO, do you know ...

ZUCKERBERG: The market cap of the company was greater than that, yes.

DEGETTE: Greater than $74 billion.

Last year, Facebook earned a profit of $15.9 billion on $40.7 billion in revenue, correct? Yes or no.

ZUCKERBERG: Yes.

DEGETTE: Now, since the revelations surrounding Cambridge Analytica, Facebook has not noticed a significant increase in users deactivating their accounts. Is that correct?

ZUCKERBERG: Yes.

DEGETTE: Now, since the revelations surrounding Cambridge Analytica, Facebook has also not noticed a decrease in user interaction on Facebook. Correct?

ZUCKERBERG: Yes, that's correct.

DEGETTE: Okay. Now, I want to take a minute to talk about some of the civil and regulatory penalties that we've been seeing.

I'm aware of two class-action lawsuits that Facebook has settled relating to privacy concerns: Lane v. Facebook was settled in 2010. That case resulted in no money being awarded to Facebook users. Is that correct?

ZUCKERBERG: Congresswoman, I'm not familiar with the details of that.

DEGETTE: Do you — you're — you're the CEO of the company, correct?

ZUCKERBERG: Yes.

DEGETTE: Now, there — this — this major lawsuit was settled. Do you know — do you know about the lawsuit?

ZUCKERBERG: Congresswoman, I — I get briefed on — on these things ...

(CROSSTALK)

DEGETTE: Do you know about this lawsuit, Lane v. Facebook? Yes or no?

ZUCKERBERG: I'm not familiar with the details of it.

DEGETTE: Okay. If you can supplement — I'll just tell you, there was this lawsuit, and the users got nothing.

In another case, Fraley v. Facebook, it resulted in a 2013 settlement fund of $20 million being established, with $15 individual payment — payouts to Facebook users, beginning in 2016. Is that correct?

ZUCKERBERG: Congresswoman, I'm not familiar with ...

(CROSSTALK)

DEGETTE: You don't know about that one either.

ZUCKERBERG: I — I ...

DEGETTE: Okay. Well, I'll tell you it happened.

ZUCKERBERG: ... I discuss them with — with our team, but I don't remember the exact details of them.

DEGETTE: Okay. Now, as the result of a 2011 FTC investigation into Facebook's privacy policy — do you know about that one?

ZUCKERBERG: The FTC investigation?

DEGETTE: Yes.

ZUCKERBERG: Yes.

DEGETTE: Okay. You entered into a consent decree with the FTC which carried no financial penalty for Facebook. Is that correct?

ZUCKERBERG: Congresswoman, I don't remember if we had a financial penalty.

DEGETTE: You're the CEO of the company, you entered into a consent decree, and you don't remember if you had a financial penalty?

ZUCKERBERG: I — I remember the consent decree. The consent decree is extremely important to how we operate the company.

DEGETTE: Yes. I would think a financial penalty would be, too.

Okay, well, the reason you probably don't remember is because the FTC doesn't have the authority to issue financial penalties for first-time violations.

The reason I'm asking these questions, sir, is because we continue to have these abuses and these — and these data breaches, but, at the same time, it doesn't seem like future activities are prevented.

And so I think one of the things that we need to look at in the future, as we work with you and others in the industry, is putting really robust penalties in place in case of — of improper actions.

And that's why I ask these questions.

WALDEN: The gentlelady's time is expired.

Chair recognizes the gentleman from Louisiana, the whip of the House, Mr. Scalise, for four minutes.

REP. STEVE SCALISE (R-LA.): Thank you, Mr. Chairman. And, Mr. Zuckerberg, I appreciate you coming here. I know, as some of my colleagues mentioned, you came here voluntarily, and we appreciate the opportunity to have this discussion, because, clearly, what your company's been able to do has revolutionized the way that people can connect.

And there's a tremendous benefit to our country. Now it's a worldwide platform, and it's — it's helped create a shortage of computer programmers. So, as a former computer programmer, I think we would both agree that we need to encourage more people to go into the computer sciences, because our country is a world leader, thanks to your company and so many others.

But it obviously raises questions about privacy and data and how the data is shared and what is a user's expectation of where that data goes. So I want to ask a few questions.

First, would you agree that we need more computer programmers and people to go into that field?

ZUCKERBERG: Congressman, yes.

SCALISE: That's a public service announcement we just made, so appreciate you ...

(LAUGHTER)

... joining me in that.

And Mr. Shimkus's question — it was really a follow-up to a question yesterday that — that you weren't able to answer, but it was dealing with how Facebook tracks users, especially after they log off.

And you had said, in relation to Congressman Shimkus's question, that there is data mining, but it goes on for security purposes. So my question would be, is that data that is mined for security purposes also used to sell as part of the business model?

ZUCKERBERG: Congressman, I believe that those are — are — that we collect different data for those. But I can follow up on the details of — of that.

SCALISE: All right. If you could follow up, I would appreciate that.

Getting into this — this new realm of content review, I know some of the people that work for Facebook — Campbell Brown said, for example, “This is changing our relationship with publishers and emphasizing something that Facebook has never done before: It's having a point of view.”

And you mentioned the Diamond and Silk example, where there — you — you, I think, described it as a mistake. Were the people who made that mistake held accountable in any way?

ZUCKERBERG: Congressman, let me follow up with you on that. That situation developed while I was here, preparing to testify, so I'm not ...

SCALISE: Okay.

(CROSSTALK)

ZUCKERBERG: ... details on it.

SCALISE: I do want to ask you about a study that was done dealing with the algorithm that Facebook uses to describe what is fed to people through the news feed.

And what they found was, after this new algorithm was implemented, that there was a tremendous bias against conservative news and content, and a favorable bias toward liberal content.

And, if you can look at that, that shows a 16-point disparity, which is concerning. I would imagine you're not going to want to share the algorithm itself with us. I'd encourage you if you wanted to do that. But who develops the algorithm?

I wrote algorithms before, and you can determine whether or not you want to write an algorithm to sort data, to compartmentalize data; but you can also put a bias in, if that's the directive. Was there a directive to put a bias in? And, first, are you aware of this bias that many people have looked at and analyzed and seen?

ZUCKERBERG: Congressman, this is a really important question. There is absolutely no directive in any of the changes that we make to have a bias in anything that we do. To the contrary, our goal is to be a platform for all ideas ...

(CROSSTALK)

SCALISE: And I know we're — we're almost out of time. So, if you can go back and look and determine if there was a bias — whoever developed that software — you have 20,000 people that work on some of this data analysis — if you can look and see if there is a bias and let us know if there is and what you're doing about it, because that is disturbing, when you see that kind of disparity.

Finally, there has been a lot of talk about Cambridge and what they've done and the last campaign. In 2008 and 2012, there was also a lot of this done.

One of the lead digital heads of the Obama campaign said recently, “Facebook was surprised we were able to suck out the whole social graph, but they didn't stop us once they realized that was what we were doing. They came to office in the days following the election recruiting and were very candid that they allowed us to do things they wouldn't have allowed someone else to do, because they were on our side.”

That's a direct quote from one of the heads of the Obama digital team. What — what would she mean by they — Facebook — were on our side?

ZUCKERBERG: Congressman, we didn't allow the Obama campaign to do anything that any developer on the platform wouldn't have otherwise been able to do.

(CROSSTALK)

SCALISE: So she was making an inaccurate statement, in your point of view?

ZUCKERBERG: Yes, I ...

(CROSSTALK)

WALDEN: Gentleman's time has expired.

SCALISE: ... the comments and look forward to those answers. Yield back the balance of my time.

WALDEN: Chair now recognizes the gentleman from Pennsylvania, Mr. Doyle, for four minutes.

REP. MIKE DOYLE (D-PA.): Thank you, Mr. Chairman. Mr. Zuckerberg, welcome.

Facebook uses some of the most advanced data processing techniques and technologies on the planet, correct?

(CROSSTALK)

ZUCKERBERG: Congressman, we pride ourselves on — on doing good technical work, yes.

DOYLE: Thank you. And — and you use these technologies to flag spam, identify offensive content and track user activity, right?

ZUCKERBERG: Among other things.

DOYLE: But not until 2015, when The Guardian first reported on Cambridge Analytica using Facebook user data — was that the first time Facebook learned about these allegations?

ZUCKERBERG: Congressman, in 2015, when we heard that the developer on our platform, Aleksandr Kogan ...

DOYLE: Was that the first time you heard about it, when it was ...

ZUCKERBERG: That — that Aleksandr Kogan had ...

DOYLE: ... reported by The Guardian?

ZUCKERBERG: ... sold data to Cambridge Analytica?

DOYLE: When The Guardian made the report, was that the first time you heard about it?

ZUCKERBERG: Yes.

DOYLE: Thank you.

So the — you routinely learn about these violations through the press?

ZUCKERBERG: Congressman, sometimes we do. I generally think that ...

DOYLE: Let me ask you this. You have the capability to audit developers' use of Facebook user data and — and do more to prevent these abuses. But the problem at Facebook not only persisted; it proliferated.

In fact, relatives (sic) to other types of problems you had on your platform, it — it seems as though you turned a blind eye to this. Correct?

ZUCKERBERG: Congressman, I disagree with that assessment. I do think that, going forward, we need to take a more proactive view of — of policing what the developers do. But, looking back, we've had an app review process. We investigate ...

DOYLE: But, Mr. Zuckerberg ...

ZUCKERBERG: ... tens of thousands of apps a year.

DOYLE: ... it seems to us that — that — it seems like you were more concerned with attracting and retaining developers on your platform than you were with ensuring the security of Facebook user data.

Let me switch gears. Your company is subject to a 20-year consent decree with the FTC since 2011. Correct?

ZUCKERBERG: Congressman, we have a consent decree, yes.

DOYLE: And that decree emerged out of a number of practices that Facebook engaged in that the FTC deemed to be unfair and deceptive.

One such practice was making Facebook users' private information public without sufficient notice or consent; claiming that Facebook certified the security and integrity of certain apps when, in fact, it did not; and enabling developers to access excessive information about a user and their friends. Is that correct?

ZUCKERBERG: Congressman, I'm not — I'm not familiar with all of the things that the FTC said, although I'm very familiar with the FTC ...

DOYLE: But these were part of the — the consent decree.

(CROSSTALK)

ZUCKERBERG: ... order, itself.

DOYLE: So I think — I'm — I'm just concerned that, despite this consent decree, Facebook allowed developers access to an unknown number of user profiles on Facebook for years — potentially hundreds of millions, potentially more — and not only allowed, but partnered with individuals and app developers such as Aleksandr Kogan, who turned around and sold that data on the open market and to companies like Cambridge Analytica.

Mr. Zuckerberg, you've said that you plan to audit tens of thousands of developers that may have improperly harvested Facebook user data. You also said that you planned to give all Facebook users access to some user controls that will be made available in the E.U. under the GDPR.

But it strikes me that there's a real trust gap here. This developer data issue is just one example. But why should we trust you to follow through on these promises when you have demonstrated repeatedly that you're willing to flout both your own internal policies and government oversight when the needs suit you?

ZUCKERBERG: Congressman, respectfully, I disagree with that characterization. We've had a review process for apps for years. We've reviewed tens of thousands of apps a year and taken action against a number of them.

Our process was not enough to catch a developer who sold data ...

DOYLE: I see my time is almost over.

ZUCKERBERG: ... that they had in their ...

DOYLE: I just want to say, Mr. Chairman ...

(CROSSTALK)

ZUCKERBERG: ... outside of our system.

DOYLE: ... that, to my mind, the only way we're going to close this trust gap is through legislation that creates and empowers a sufficiently resourced expert oversight agency with rulemaking authority to protect the digital privacy and ensure ...

WALDEN: Gentleman's ...

DOYLE: ... that companies protect our users' data. With that, I yield back.

WALDEN: ... Gentleman's time's expired.

Chair recognizes the chairman of the Subcommittee on Digital Commerce and Consumer Protection, Mr. Latta of Ohio, for four minutes.

REP. ROBERT E. LATTA (R-OHIO): Well thank you, Mr. Chairman. And — and, Mr. Zuckerberg, thanks very much for being with us today.

First question I have is, can you tell the Facebook users that the Russians and the Chinese have not used the same methods as other third parties to scrape the entire social network for their gain?

ZUCKERBERG: Congressman, we have not seen that activity.

LATTA: None at all?

ZUCKERBERG: I — not that I am aware of.

LATTA: Okay.

Let me ask this question. You know, it's a little bit that's been going on — when you made your opening statement in regards to what you'd like to see done with the — with the company and — and steps going — moving forward, there's been a couple questions, you know, about that you're going to be investigating the apps.

How many apps are there out there that you'd have to investigate?

ZUCKERBERG: There are tens of thousands of apps that had access to a large amount of people's information before we locked down the platform in 2014. So we're going to do an investigation that first involves looking at their patterns of API access and what those companies were doing.

And then, if we find anything suspicious, then we're going to bring in third-party auditors to go through their technical and physical systems to understand what they did.

And, if they — we find that they misused any data, then we'll ban them from our platform, make sure they delete the data and tell everyone affected.
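The process described in this exchange (flag apps with unusual API-access patterns, escalate suspicious ones to third-party auditors, then ban confirmed abusers, force deletion, and notify affected users) can be sketched as a simple two-stage pipeline. This is purely an illustrative model of the triage logic as stated in the testimony; every name, field, and threshold below is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class AppRecord:
    name: str
    api_calls_per_user: float       # observed access pattern
    users_reached: int
    misuse_confirmed: bool = False  # would be set by a third-party audit

def triage(apps, call_threshold=50.0, reach_threshold=10_000):
    """Stage 1: flag apps whose pattern of API access looks excessive."""
    return [a for a in apps
            if a.api_calls_per_user > call_threshold
            or a.users_reached > reach_threshold]

def enforce(flagged):
    """Stage 2: after an audit confirms misuse, ban the app and
    queue notifications for everyone it reached."""
    banned, notify = [], []
    for app in flagged:
        if app.misuse_confirmed:
            banned.append(app.name)
            notify.append((app.name, app.users_reached))
    return banned, notify

apps = [
    AppRecord("quiz_app", api_calls_per_user=120.0,
              users_reached=270_000, misuse_confirmed=True),
    AppRecord("calendar_sync", api_calls_per_user=3.0, users_reached=500),
]
flagged = triage(apps)
banned, notify = enforce(flagged)
```

The two stages mirror the testimony's sequencing: cheap pattern analysis over tens of thousands of apps first, with the expensive audit and enforcement steps reserved for the flagged subset.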

LATTA: Just to follow up on that, then, how long would it take to then investigate each of those apps, once you're doing that? Because, again, when you're talking about tens of thousands and you're going through that entire process, then how long will it take to go through each one of those apps?

ZUCKERBERG: Yes, Congressman. It's going to take many months to do this full process.

LATTA: Okay.

ZUCKERBERG: And it's going to — it's going to be an expensive process with a lot of auditors. But we think that this is the right thing to do at this point.

You know, before, we'd thought that, when developers told us that they weren't going to sell data, that that was — that that was a good representation. But one of the big lessons that we've learned here is that, clearly, we cannot just take developers' word for it. We need to go and enforce that.

LATTA: Okay. We were talking about audits, as there have been some questions about this. On the audits, in 2011, Facebook signed — it did sign that consent order with the Federal Trade Commission for the privacy violations.

Part of that consent order requires Facebook to submit third-party privacy audits to the FTC every two years. First, are you aware of the audits? And, second, why didn't the audits disclose or find these issues with the developer's access to users' data?

ZUCKERBERG: Yes, Congressman, I'm — I'm aware of the audits that we do. We do audits every other year. They're ongoing. The audits have not found material issues with our privacy programs in place at the company.

I think the broader question here is — we have had this FTC consent decree, but we take a broader view of what our responsibility for people's privacy is.

And our — our view is that this — what a developer did — that they represented to us that they were going to use the data in a certain way, and then, in their own systems, went out and sold it — we do not believe is a violation of the consent decree. But it's clearly a breach of people's trust.

And the standard that we hold ourselves to is not just following the laws that are in place. But we also — we just want to take a broader view of this in protecting people's information.

LATTA: Let me — I'm about out of time here.

Are you aware that Facebook did provide the auditors with all the information they requested for — when doing the FTC audits?

ZUCKERBERG: Sorry, can you repeat that?

LATTA: Yeah. Did we — did Facebook provide the auditors with all the information it requested when they were preparing the audit for the FTC?

ZUCKERBERG: Congressman, I believe we do provide the audits to the FTC.

LATTA: Okay. So — but all the information is provided. And were you ever personally asked to provide information or feedback in these audits to the FTC?

ZUCKERBERG: Congressman, not personally, although I'm briefed on all of the audits by our team.

LATTA: Okay.

Mr. Chair, my time's expired and I yield back.

WALDEN: Gentleman yields back.

Chair recognizes the gentlelady from Illinois, Ms. Schakowsky, for four minutes.

REP. JAN SCHAKOWSKY (D-ILL.): Thank you, Mr. Chairman.

You know, you have a long history of growth and success, but you also have a long list of apologies. In 2003, it started at Harvard. “I apologize for any harm done as a result of my neglect.” 2006: “We really messed this one up.” 2007: “We simply did a bad job. I apologize for it.” 2010: “Sometimes we move too fast.” 2011: “I'm the first to admit that we're made — that we've made a bunch of mistakes.”

2017 — this is in — in connection with the Russian manipulation of the election and the data that was — came from Facebook initially: “I am — I ask for forgiveness. I will work to do better.” So it seems to me from this history that self-regulation — this is proof to me that self-regulation simply does not work.

I have a bill — the Secure and Protect Americans' Data Act — that I hope you will take a look at, very simple bill about setting standards for how you have to make sure that the data is protected, deadlines on when you have to release that information to the public. Certainly, it ought to go to the FTC, as well.

But, in response to the questions about the apps and the investigation that you're going to do, you said you don't necessarily know how long. Have you set any deadline for that? Because we know, as my colleague said, that there are tens of thousands — there's actually been 9 million apps. How long do we have to wait for that kind of investigation?

ZUCKERBERG: Congresswoman, we expect it to take many months.

SCHAKOWSKY: Years?

ZUCKERBERG: I hope not.

SCHAKOWSKY: Okay.

I want to ask you — yesterday — following up on your response to Senator Baldwin's question, you said yesterday that Kogan also sold data to other firms. You named Eunoia Technologies.

How many are there total? And what are their names? Can we get that? And how many are total — are there total?

ZUCKERBERG: Congresswoman, we can follow up with you to make sure you get all that information.

SCHAKOWSKY: Yeah, but order of magnitude?

ZUCKERBERG: I don't believe it was a large number. But, as we complete the audits we will know more.

SCHAKOWSKY: What's a large number?

ZUCKERBERG: A handful.

SCHAKOWSKY: Has Facebook tried to get those firms to delete user data and its derivatives?

ZUCKERBERG: Yes, Congresswoman. In 2015, when we first learned about it, we immediately demanded that the app developer and the firms that he sold it to delete the data. And they all represented to us that they had.

It wasn't until about a month ago that new reports surfaced that suggested that they hadn't, which is what has kicked off us needing to now go do this full audit and investigation and investigate all these other apps that have come up.

SCHAKOWSKY: And were derivatives deleted?

ZUCKERBERG: Congresswoman, we need to complete the investigation and audit before I can confirm that.

SCHAKOWSKY: You are looking at the ...

(CROSSTALK)

ZUCKERBERG: What they represented to us is that they have. But we need to now get into their systems and confirm that before I want to stand up here confidently and say what they've done.

SCHAKOWSKY: So Mr. Green asked about the General Data Protection Regulation on May 25th that's going to go into effect by the E.U. And your response was — let me ask: Is your response that exactly the protections that are guaranteed, not the — what did he say? Yeah, not just the controls, but all the rights that are guaranteed under the General Data Protection Regulations will be applied to Americans, as well?

ZUCKERBERG: Congresswoman, the GDPR has a bunch of different, important pieces. One is around offering controls over specific — over every use of people's data.

SCHAKOWSKY: Right, that's one. Yes.

ZUCKERBERG: That, we're doing.

The second is around pushing for affirmative consent and putting a control in front of people that walks people through their — their choices.

SCHAKOWSKY: Exactly.

ZUCKERBERG: We're going to do that too. The second — although that might be different, depending on the laws in specific countries and different places — but we're going to put a tool at the top of everyone's app that walks them through their settings and helps them understand what is going on.

SCHAKOWSKY: It sounds like it will not be exact. And let me say, as we look at the distribution of information ...

WALDEN: The gentlelady's time ...

SCHAKOWSKY: ... that who's going to protect us from Facebook is also a question.

Thank you. I yield back.

WALDEN: Gentlelady's time's expired.

Chair recognizes the gentlelady from Washington state, the conference chairman.

REP. CATHY MCMORRIS RODGERS (R-WASH.): Yeah, turn on the — thank you. And thank you, Mr. Zuckerberg, for joining us.

Today is clearly timely. There's a number of extremely important questions Americans have about Facebook, including questions about safety and security of their data, about the process by which their data is made available to third parties, about what Facebook is doing to protect consumer privacy as we move forward.

But one of the issues that is concerning me and I'd like to dig a little deeper into is how Facebook treats content on its platform. So, Mr. Zuckerberg, given the extensive reach of Facebook and its widespread use as a tool of public expression, do you think Facebook has a unique responsibility to ensure that it has clear standards regarding the censorship of content on its platform?

And do you think Facebook adequately and clearly defines what these standards are for its users?

ZUCKERBERG: Congresswoman, yes, I feel like we have a very important responsibility to outline what the content policies are and the community standards are.

This is one of the areas that, frankly, I'm worried we're not doing a good enough job at right now, especially because, as an American-based company where about 90 percent of the people in our community are outside of the U.S., where there are different social norms and different cultures, it's not clear to me that our current situation of how we define community standards is going to be effective for articulating that around the world.

So we're looking at different ways to evolve that, and I think that this is one of the more important things that we will do.

MCMORRIS RODGERS: Okay.

And, even focusing on content for here in America, I'd like to shift gears just a little bit and talk about Facebook's recent changes to its news feed algorithm.

Your head of news partnerships recently said that Facebook is, quote, “taking a step to define what quality news looks like and give that a boost so that, overall, there is a less — there is less competition from news.”

Can you tell me what she means by “less competition from news”? And also, how does Facebook objectively determine what is acceptable news and what safeguards exist to ensure that, say, religious or conservative content is treated fairly?

ZUCKERBERG: Yes, Congresswoman. I'm not sure specifically what that person was referring to, but I can walk you through what the algorithm change was, if that's useful.

MCMORRIS RODGERS: Well, maybe I'll just go on to my other questions, then.

There's an issue of content discrimination, and it's not a problem unique to Facebook. There's a number of high-profile examples of edge providers engaging in blocking and censoring religious and conservative political content.

In November, FCC Chairman Pai even said that edge providers routinely block or discriminate against content they don't like. This is obviously a serious allegation.

How would you respond to such an allegation? And what is Facebook doing to ensure that its users are being treated fairly and objectively by content reviewers?

ZUCKERBERG: Congresswoman, the principle that we're a platform for all ideas is something that I care very deeply about. I'm worried about bias, and we take a number of steps to make sure that none of the changes that we make are targeted at — in any kind of biased way.

And I'd be happy to follow up with you and go into more detail on that, because I agree that this is a serious issue.

MCMORRIS RODGERS: Over Easter, a Catholic university's ad with a picture of a historic San Damiano cross was rejected by Facebook. Though Facebook addressed the error within days, that it happened at all is deeply disturbing.

Could you tell me what was so shocking, sensational or excessively violent about the ad to cause it to be initially censored? Given that your company has since said that it did not violate terms of service, how can users know that their content is being viewed and judged accordingly — to objective standards?

ZUCKERBERG: Congresswoman, it sounds like we made a mistake there, and I apologize for that. And, unfortunately, with the amount of content in our systems and the current systems that we have in place to review, we make a relatively small percent of mistakes in content review. But that can be — that's — that's too many. And this is an area where we need to improve.

What I — what I will say is that I wouldn't extrapolate from a few examples, to assuming that the overall system is biased. I — I get how people can — can look at that and draw that conclusion, but I don't think that that reflects the — the way that we're trying to build the system or what we've seen.

WALDEN: Gentlelady's ...

MCMORRIS RODGERS: Thank you. And I — I just — this — this is — important issue in building trust.

ZUCKERBERG: I agree.

MCMORRIS RODGERS: And that is going to be important as we move forward.

Thank you, and I yield back.

WALDEN: Gentlelady's time is expired.

Chair recognizes the gentleman from North Carolina, Mr. Butterfield, for four minutes.

REP. G.K. BUTTERFIELD (D-N.C.): Thank you, Mr. Chairman, and thank you, Mr. Zuckerberg, for your testimony here today.

Mr. Zuckerberg, you have stated that your goal with Facebook is to build strong communities. And, certainly, that sounds good. You've stated here today, on the record, that you did not live up to the privacy expectations. And I appreciate that.

But this committee — and you must know this — this committee is counting on you to right a wrong. And I hope you get it. In my opinion, Facebook is here to stay, and so you have an obligation to protect the data that you collect and the data that you use. And Congress has the power to regulate your industry, and we have the power to penalize misconduct.

But I want to go in a different direction today, sir. You and your team certainly know how I feel about racial diversity in corporate America. And Sheryl Sandberg and I talk about that all of the time.

Let me ask you this — and — and the Congressional Black Caucus has been very focused on — on holding your industry accountable — not just Facebook, your industry — accountable for increasing African American inclusion at all levels of the industry.

And I know you've — have a number of diversity initiatives. In 2017, you've increased your black representation from 2 percent to 3 percent. While this is a small increase, it's better than none. And this does not nearly meet the definition of building a racially diverse community.

CEO leadership — and I have found this to be absolutely true — CEO leadership on issues of diversity is the only way that the technology industry will change.

So will you commit, sir, to convene — personally convene a meeting of CEOs in — in your sectors, many of them — them — all of them, perhaps are your friends — and to do this very quickly to develop a strategy to increase racial diversity in the technology industry?

ZUCKERBERG: Congressman, I think that that's a good idea and we should follow up on it. From the conversations that I have with my fellow leaders in the tech industry, I — I know that this something that we all understand that the whole industry is behind on. And Facebook is certainly a big part of that issue.

And we care about this not just from the justice angle, but because we know that having diverse, different viewpoints is what will help us serve our community better, which is ultimately what we're here to do. And I think we know that the industry is behind on this and want to ...

(CROSSTALK)

BUTTERFIELD: Well, we've talked with you over the years about this. And, while there has been some marginal improvement, we — we must do better than we have done.

Recently, you appointed an African-American — our friend, Ken Chenault — to your board. And, of course, Erskine Bowles is already on your board, who is also a friend. But — but we've — we've got to concentrate more on board membership for African Americans, and also minorities at the entry level in — within your company.

I was looking at your website a few minutes ago, and it looks like you list five individuals as leadership in your company, but none of them is African American.

I was just looking at it — not only you and Sheryl, but David (sic), Mike and Chris — that is your leadership team. And this does not reflect America. Can you improve the numbers on your leadership team to be more diverse?

ZUCKERBERG: Congressman, this is an issue that we're — we're focused on. We have a broader leadership than just five people. I mean ...

BUTTERFIELD: Not on your website.

ZUCKERBERG: I understand that.

BUTTERFIELD: We can do better than that, Mr. Zuckerberg. We certainly can.

Do you plan to add an African-American to your leadership team in the foreseeable future? And will you commit that you will continue to work with us, the Congressional Black Caucus, to increase diversity within your company that you're so proud of?

ZUCKERBERG: Congressman, we will certainly work with you. This is an important issue.

BUTTERFIELD: We also find that companies' failure to retain black employees contributes to their low presence at technology companies. And there is little transparency in retention numbers.

So will you commit to providing numbers on your retention — that's the big word — retention of your employees, disaggregated by race, in your diversity update, starting this year? Can we get that data? That — that's — that's the starting point.

ZUCKERBERG: Congressman, we — we try to include a lot of important information in the diversity updates. I will go discuss that with my team after I get back from this hearing.

BUTTERFIELD: I'm out of time, sir. I'll take this up with your team in another setting.

(CROSSTALK)

BUTTERFIELD: We'll be out there in a few weeks. Thank you.

I yield back.

WALDEN: The gentleman's time has expired. Chair now recognizes the chairman of the Oversight and Investigations Subcommittee, gentleman from Mississippi, Mr. Harper, for four minutes.

(CROSSTALK)

REP. GREGG HARPER (R-MISS.): Thank you, Mr. Chairman. Thank you, Mr. Zuckerberg for being here. And we don't lose sight of the fact that you're a great American success story. It is a part of everyone's life and business — sometimes, maybe too often. But I thank you for taking the time to be here.

And our concern is to make sure that it's — it's fair. We worry because we're — we're looking at possible government regulation here. Certainly, this self-governing, which has had some issues and how you factor that — and — and we — you know, we're trying to keep up with the algorithm changes on — on how you determine the prioritization of the news feeds.

And you look at, well, it's got to be — it needs to be trustworthy and reliable and relevant — well, who's going to determine that? That also has an impact. And, even though you say you don't want the bias, it does — it is dependent upon who's setting what those standards are in that.

And so I want to ask you a couple questions, if I may. And this is a quote from Paul Grewal, Facebook's V.P. and general counsel — said, “Like all app developers, Mr. Aleksandr Kogan requested and gained access to information from people after they chose to download his app.”

Now, under Facebook policy, in 2013, if Cambridge Analytica had developed the This is Your Digital Life app, they would have had access to the same data they purchased from Mr. Kogan. Would that be correct?

ZUCKERBERG: Congressman, that's correct. And a different developer could have built that app.

HARPER: Okay. Now, according to PolitiFact.com — and this is a quote — “The Obama campaign and Cambridge Analytica both gained access to huge amounts of information about Facebook users and their friends, and in neither case did the friends of app users consent,” close quote.

This data that Cambridge Analytica acquired was used to target voters with political messages, much as the same type of data was used by the Obama campaign to target voters in 2012. Would that be correct?

ZUCKERBERG: Congressman, the big difference between these cases is that, in — in the Kogan case, people signed into that app expecting to share the data with Kogan, and then he turned around and, in violation of our policies and in violation of people's expectations, sold it to a third-party firm — to Cambridge Analytica, in this case.

HARPER: Sure.

ZUCKERBERG: I — I think that we — we were very clear about how the platform worked at the time — that anyone could sign into an app and they'd be able to bring their information, if they wanted, and some information from their friends.

People had control over that. So, if you wanted, you could — you could turn off the ability to sign into apps, or turn off the ability for your friends to be able to bring your information. The platform worked the way that we had designed it at the time.

I think we now know that we should have a more restrictive platform where people cannot also bring information from their friends, and can only bring their own information. But that's the way that system worked at the time.
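The permission model Zuckerberg describes — users could turn off app sign-in entirely, or turn off friends' ability to bring their information to apps, and after 2014 apps could no longer pull friends' data at all — can be sketched as follows. This is an illustrative reconstruction of the rules as stated in the testimony, not Facebook's actual implementation; all names and settings are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class User:
    name: str
    allows_app_signin: bool = True
    allows_friend_sharing: bool = True   # pre-2014 friend-data setting
    friends: tuple = ()

def data_visible_to_app(user, platform_year):
    """Return which profiles an app signed into by `user` could read
    under the rules described in the testimony."""
    if not user.allows_app_signin:
        return []                        # user opted out of apps entirely
    visible = [user.name]                # your own data is always in scope
    if platform_year < 2014:             # older model: friends' data too,
        visible += [f.name for f in user.friends   # unless a friend opted out
                    if f.allows_friend_sharing]
    return visible

alice = User("alice")
bob = User("bob", allows_friend_sharing=False)
carol = User("carol", friends=(alice, bob))

pre_2014 = data_visible_to_app(carol, platform_year=2013)
post_2014 = data_visible_to_app(carol, platform_year=2015)
```

The `platform_year` branch captures the 2014 lockdown the testimony keeps returning to: the same app sign-in yields friends' data before the change and only the signer's own data after it.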

HARPER: And — and, whether in violation of the agreement or not, you — you agree that users have an expectation that their information would be protected and remained private, and not be sold.

And so that's something — the — the reason that we're here today. You know, and I can certainly understand the general public's outrage if they're concerned regarding the way Cambridge Analytica acquired their information.

But, if people are outraged because they use that for political reasons, would that be hypocritical? Shouldn't they be equally outraged that the Obama campaign used the — the data of Facebook users without their consent in 2012?

ZUCKERBERG: Congressman, what I think people are — are rightfully very upset about is that an app developer that people had shared data with sold it to someone else and, frankly, we didn't do enough to prevent that or understand it soon enough.

HARPER: Thank you.

ZUCKERBERG: And now we have to go through and — and put in place systems that prevent that from happening again and — making sure that we have sufficient controls in place in our ecosystem so, that way, developers can't abuse people's data.

HARPER: Thank you, Mr. Zuckerberg.

My time is expired — yield back.

WALDEN: Gentleman yields back the balance of his time.

Gentlelady from California, Ms. Matsui, is recognized for four minutes.

REP. DORIS MATSUI (D-CALIF.): Thank you, Mr. Chairman, and welcome, Mr. Zuckerberg. Thank you very much here.

You know, I was just thinking about Facebook and how you developed your platform — first, from a social platform with — amongst friends and colleagues and joining a community. And a lot of that was based upon trust, because you knew your friends, right?

But that evolved into this business platform, and one of the pillars still was trust. And I think you would all — I think everybody here would agree that trust is in short supply here, and that's why we're here today.

Now, you've constantly maintained that consumers own the data they provided to Facebook and should have control over it. And I appreciate that, and I just want to understand more about what that means.

To me, if you own something, you ought to have to — say about how and when it's used. But, to be clear, I don't just mean pictures, email addresses, Facebook groups or pages.

I understand the data and information consumers provided to Facebook can be, and perhaps is, used by algorithms to form assumptions and inferences about users to better target ads to the individuals.

Now, do you believe that consumers actually own their data, even when that data has been supplemented by a data broker — assumptions algorithms have made about that user or otherwise?

And this is kind of the question that Ms. Blackburn has come up with — our own comprehensive profile, which is kind of our virtual self.

ZUCKERBERG: Congresswoman, I — I believe that people own all of their own content. Where this gets complicated is — let's say I take a photo and I share it with you. Now, is that my photo, or is it your photo?

I — I would take the position that it's our photo, which is why we make it so that you can bring — it's — that I can bring that — that photo to another app, if I want, but you can't.

MATSUI: But, once it gets to the data broker, though — so there are certain algorithms and certain assumptions made. What happens after that?

ZUCKERBERG: Sorry. Can you clarify that?

MATSUI: Well, what I mean is — is that, if you supplement this data — you know, you say you're owning it, but you supplement this — when other data brokers, you know, use their own algorithms to supplement this and make their own assumptions, then what happens there? Because that is — to me, somebody else is taking that over. How can you say that we own that data?

ZUCKERBERG: Congresswoman, all the data that you put in, all the content that you share on Facebook is yours. You control how it's used. You can remove it at any time. You can get rid of your account and get rid of all of it at once. You can ...

(CROSSTALK)

MATSUI: So — but you can't claw it back once it gets out there, right? I mean, that's really — we might own our own data, but, once it's used in advertising, we lose control over it. Is that not right?

ZUCKERBERG: Congresswoman, I — I disagree with that, because one core tenet of our advertising system is that we don't sell data to advertisers. Advertisers don't get access to your data.

There's a — there's a core misunderstanding about how that system works, which is that — let's say if you're — if you're a shop, and you're selling muffins, right, it's — you might want to target people in a specific town who might be interested in baking, or — or some demographic.

But we don't send that information to you. We just show the message to the right people. And that's a really important, I think, common misunderstanding ...

MATSUI: Yeah. I understand that.

ZUCKERBERG: ... about how this system works.

MATSUI: But Facebook sells ads based at least in part on data users provide to Facebook. That's right. And the more data that Facebook collects — allows you to better target ads to users or classes of users.

So, even if Facebook doesn't earn money from selling data, doesn't Facebook earn money from advertising based on that data?

ZUCKERBERG: Yes, Congresswoman, we run ads. That's the — the business model is running ads. And we use the data that people put into the system in order to make the ads more relevant, which also makes them more valuable.

But it's — what we hear from people is that, if they're going to see ads, they want them to be good and relevant ...

(CROSSTALK)

MATSUI: But we're not controlling that data.

ZUCKERBERG: No, you have complete control over that.

WALDEN: The gentlelady's time is expired.

As previously agreed, we will now take a five-minute recess, and committee members and — and our witness need to plan to be back in about five minutes. We stand in recess.

(RECESS)

WALDEN: We'll call the Energy and Commerce Committee back to order and recognize the gentleman from New Jersey, Mr. Lance, for four minutes for purposes of questions.

REP. LEONARD LANCE (R-N.J.): Thank you very much, Mr. Chairman.

Mr. Zuckerberg, you are here today because you are the face of Facebook, and you have come here voluntarily. And our questions are based upon our concern about what has occurred and how to move forward.

I'm sure you have concluded, based upon what we've asked, that we are deeply offended by censoring of content inappropriately by Facebook. It — examples have been raised: a Roman Catholic university, a state senate candidate in Michigan.

I would be offended if this censoring were occurring on the left, as well as the right, and I want you to know that. And do you take from what we have indicated so far that, in a bipartisan fashion, Congress is offended by inappropriate censoring of content?

ZUCKERBERG: Congressman, yes. This is extremely important. And I think the — the point that you raise is particularly important — that we've heard in — today a number of examples of — where we may have made content review mistakes on conservative content. But I can assure you that there are a lot of folks who think that we make content moderation or content review mistakes of liberal content, as well.

LANCE: Fair enough. My point is that we don't favor censoring in any way, so long as it doesn't involve hate speech or violence or terrorism. And, of course, the examples today indicate quite the contrary, number one.

Number two, Congresswoman Blackburn has mentioned her legislation. I'm a co-sponsor of the BROWSER legislation. I commend it to your attention, to the attention of your company. It is for the entire ecosystem. It is for ISPs and edge providers. It is not just for one or the other.

It is an opt-in system, similar to the system that exists in your — might I respectfully request of you, Mr. Zuckerberg, that you and your company review the BROWSER legislation? And I would like your support for that legislation after your review of it.

ZUCKERBERG: We will review it and get back to you.

LANCE: Thank you very much.

Your COO, Sheryl Sandberg, last week, appeared on the Today program. And she admitted the possibility that additional breaches in personal information could be discovered by the current audits.

Quote, “We're doing an investigation. We're going to do the audits. And, yes, we think it's possible. That's why we're doing the audits.” Then the COO went on to say, “Facebook cared about privacy all along, but I think we got the balance wrong.” Do you agree with the statement of your COO?

ZUCKERBERG: Yes, Congressman, I do. We were trying to balance two equities: on the one hand, making it so that people had data portability, the ability to bring their data to another app in order to have new experiences in other places, which I think is a value that we all care about.

On the other hand, we also need to balance making sure that everyone's information is protected. And I think that we — we didn't get that balance right up front.

LANCE: Thank you. I — I certainly concur with the statement of the COO, as affirmed by you today, that you got the balance wrong.

And then, regarding Cambridge Analytica, the fact that 300,000 individuals or so gave consent, but that certainly didn't mean they gave consent to — to 87 million friends — do you believe that that action violated your consent agreement with the Federal Trade Commission?

ZUCKERBERG: We do not believe it did. But, regardless, we take a broader view of what our responsibility is to protect people's privacy. And, if a developer who people gave their information to — in this case, Aleksandr Kogan — then goes and, in violation of — of his agreement with us, sells the data to Cambridge Analytica, that's a big issue.

And I think people have a right to be very upset. I'm upset that that happened. And we need to make sure that we put in place the systems to prevent that from happening again.

LANCE: Thank you. I think you may have violated the agreement with the Federal Trade Commission, and I'm sure that will be determined in the future. Thank you, Mr. Chairman.

WALDEN: Thank the gentleman from New Jersey, recognize the gentlelady from Florida, Ms. Castor, for four minutes.

REP. KATHY CASTOR (D-FLA.): Thank you, Mr. Chairman.

Welcome, Mr. Zuckerberg.

For all of the benefits that Facebook has provided in building communities and connecting families, I think a devil's bargain has been struck.

And, in the end, Americans do not like to be manipulated. They do not like to be spied on. We don't like it when someone is outside of our home, watching. We don't like it when someone is following us around the neighborhood or, even worse, following our kids or stalking our children.

Facebook now has evolved to a place where you are tracking everyone. You are collecting data on just about everybody. Yes, we understand the Facebook users that — that proactively sign in, they're in part of the — that platform, but you're following Facebook users even after they log off of that platform and application, and you are collecting personal information on people who do not even have Facebook accounts. Isn't that right?

ZUCKERBERG: Congresswoman, I believe that we ...

CASTOR: Yes or no?

ZUCKERBERG: Congresswoman, I — I'm not sure — I don't think that that's what we're tracking.

CASTOR: No, you're collecting — you have already acknowledged that you are doing that for security purposes, and commercial purposes. So you are — you're collecting data outside of Facebook. When someone goes to a website, and it has the Facebook like or share, that data is being collected by Facebook, correct?

ZUCKERBERG: Congresswoman ...

CASTOR: Yes or no.

ZUCKERBERG: That's right, that we — that we understand, in order to show which of your friends liked a page ...

CASTOR: Yeah, so for people who don't even have Facebook — I don't think that the average American really understands that today, something that fundamental, and that you're tracking everyone's online activities. Their searches, you can track what people buy, correct?

ZUCKERBERG: Congressman — Congresswoman ...

CASTOR: You're collecting that data, what people purchase online, yes or no?

ZUCKERBERG: I — I — I actually — if they share it with us. But Congresswoman, overall, I — I'm ...

CASTOR: Because it has a share button, so it's — it's — it's gathering. Facebook has the application. In fact, you've patented applications to do just that, isn't that correct? To collect that data?

ZUCKERBERG: Congresswoman, I don't think any of those buttons share transaction data. But broadly, I — I disagree with the characterization.

(CROSSTALK)

CASTOR: But they — they track you. You want — you're collecting medical data, correct, on — on people that — that are on the Internet, whether they're Facebook users or not, right?

ZUCKERBERG: Congresswoman, yes, we collect some data for security purposes, and ...

(CROSSTALK)

CASTOR: And you're collecting — you watch where we go. Senator Durbin had a — had a funny question yesterday about where you're staying, and you didn't want to share that, but you — Facebook also gathers that data about where we travel, isn't that correct?

ZUCKERBERG: Congresswoman, everyone has control over how that works.

CASTOR: I'm going to get to that, but yes, you are — would you just acknowledge if yes, Facebook is — that's the business you're in, gathering data and aggregating that data, right?

ZUCKERBERG: Congresswoman, I disagree with that characterization.

CASTOR: You're not — are you saying you do not gather data on — on where people travel, based upon their Internet, and the — the ways they sign in, and things like that?

ZUCKERBERG: Congresswoman, the primary way that Facebook works is that people choose to share data, and they share content because they're trying to communicate.

(CROSSTALK)

CASTOR: Primary, but the — the other way that Facebook gathers data is you buy data from data brokers, outside of the platform, correct?

ZUCKERBERG: Congresswoman, we just announced two weeks ago that we were going to stop interacting with data brokers, and even though that's an industry norm, to make it so that the advertising can be more relevant ...

CASTOR: But I think, in the end — see, it's practically impossible these days to remain untracked in America, for all the benefits Facebook and the Internet have brought, and that's not part of the bargain. Current laws have not evolved, and Congress has not adopted laws, to address digital surveillance, and Congress should act. And I do not believe that the controls, the opaque consent agreements and settings, are an adequate substitute for fundamental privacy protections for consumers.

Now some ...

WALDEN: The gentle — the gentlelady's time.

CASTOR: Thank you. I yield back my time.

WALDEN: The gentlelady's time ...

CASTOR: Let that stand. And I'd like to ask unanimous consent that I put my constituents' questions in the record.

WALDEN: Without objection.

CASTOR: Thank you.
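The tracking mechanism at issue in the exchange above can be illustrated with a minimal sketch: when a third-party page embeds a platform widget such as a Like button, the visitor's browser requests that widget from the platform's servers, sending the embedding page's URL and any platform cookies along with it. All names below are hypothetical; this illustrates the general technique, not Facebook's implementation.

```python
from dataclasses import dataclass, field

@dataclass
class PlatformServer:
    # Pages seen per browser cookie, whether or not the cookie maps to an account.
    browsing_log: dict = field(default_factory=dict)

    def serve_widget(self, referer_url: str, cookie_id: str) -> str:
        # The widget request itself is the tracking event: the browser
        # volunteers the embedding page's URL (Referer) and its cookies.
        self.browsing_log.setdefault(cookie_id, []).append(referer_url)
        return "<button>Like</button>"

platform = PlatformServer()
# A logged-in user and a visitor with no account both load pages that
# embed the widget; both end up in the log.
platform.serve_widget("https://news.example/article-1", "user-123")
platform.serve_widget("https://shop.example/item-9", "anon-456")
assert platform.browsing_log["anon-456"] == ["https://shop.example/item-9"]
```

This is why the logging happens for account holders and non-holders alike: the request fires whenever the page loads, before any login check.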

WALDEN: Chair now recognizes the gentlemen from Kentucky, Mr. Guthrie, for (inaudible) minutes.

REP. BRETT GUTHRIE (R-KY.): Thank you, Mr. Chairman. Thanks for being here.

When I first got into public office, the Internet was really kicking off, and I had a lot of people complain about ads, just the inconvenience of ads and the cumbersomeness of the Internet. I remember telling someone one time, being from Kentucky, a basketball fan, I said, “There's nothing I hate worse than the four-minute timeout, the TV timeout. It interrupts the flow of the game and everything. But because of the four-minute timeout, I get to watch the game for free, so that's something I'm willing to accept to watch for free.”

What you're not really willing to accept is that your data's just out there, and it — it's being used. But it's being used in the — in the right way, and it's — it's funny, because I was going to ask this question anyway. My — my friend and I were planning a family trip to Florida, and I searched a town in Florida, and all of a sudden, I started getting ads for a brand of hotel that I typically stay in, and a great hotel at a price, available to the public because it was on the Internet, that I was willing to pay to stay there. So I thought it was actually convenient. Instead of getting just an ad to someplace I'll never go, I got an ad specifically to a place I was — I was looking to go, so I thought that was convenient. And it wasn't Facebook, although my wife used Facebook to message my mother-in-law this weekend for where we're meeting up, so it's very valuable. We get to do that for free, because your business model relies on consumer-driven data.

This wasn't Facebook. It was a search engine, but they use consumer — consumer-driven data to target an ad to me, so you're not unique in Silicon Valley, or in this Internet world in doing this type of targeted ads, are you?

ZUCKERBERG: No, Congressman. You're — you're right. I mean, this is ad-based business models have been a common way that people have been able to offer free services for a long time. And our social mission of trying to help connect everyone in the world relies on having a service that can be affordable for everyone; that everyone can use. And that's why the ads business model is in service of the social mission that we have, and you know, I think sometimes that gets lost, but I think that's a really important point.

GUTHRIE: But — but you're different in that instead of getting just a broad — When I'm watching the — the Hilltoppers on basketball, the person advertising me doesn't know anything about me. I'm just watching the ad, so there's no data, no agreement, or no risk, I guess, there.

But with you, there — there is consumer-driven data. But if we were to greatly reduce or stop — or just greatly reduce, through legislation, the use of consumer-driven data for targeting ads, what do you think that would do to the Internet, just — and when I say Internet, I mean everything, not just Facebook.

ZUCKERBERG: Well, Congressman, it would make the ads less relevant. So what we ...

GUTHRIE: So if you had less revenue, what would that do to ...

ZUCKERBERG: And — yeah. It would — it would reduce — it would have a number of effects. For people using the services, it would make the ads less relevant to them. For businesses, like the small businesses that use advertising, it would make advertising more expensive, because now they would have to reach — they would have to pay more to reach more people, and efficiently, because targeting helps small businesses be able to afford and — and reach — and reach people as effectively as big companies have typically had the ability to do for a long time.

It would affect our revenue some amount too, but I think one — there are a couple of points here that are lost. One is that we already give people a control to not use that data and ads, if they want. Most people don't do that. I think part of the reason for that is that people get that if they are going to see ads, that they want them to be relevant.

But the other thing is that our — a lot of what our business — what makes the ads work, or what makes the business good is just that people are very engaged with Facebook. We have more than a billion people who spend almost an hour a day across all our services.
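The targeting argument made in this exchange can be sketched in a few lines: a system that matches ads to a user's interests wastes fewer impressions than untargeted delivery, which is why targeting is claimed to lower costs for small advertisers. The scoring rule and ad schema below are hypothetical illustrations, not any real ad platform's API.

```python
# Hypothetical sketch of interest-based ad selection: each ad declares the
# interests it targets, and the ad whose targeting best overlaps the
# user's interests is chosen. Untargeted delivery would spend the same
# impression on users with no matching interest.

def pick_ad(user_interests: set, ads: list) -> dict:
    # Score each ad by how many of its targeted interests the user shares.
    return max(ads, key=lambda ad: len(ad["targets"] & user_interests))

ads = [
    {"name": "hiking-boots", "targets": {"hiking", "outdoors"}},
    {"name": "hotel-florida", "targets": {"travel", "florida"}},
]

# A user who just searched for a Florida trip matches the hotel ad,
# echoing the anecdote earlier in this hearing.
chosen = pick_ad({"travel", "florida", "basketball"}, ads)
assert chosen["name"] == "hotel-florida"
```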

GUTHRIE: I have 30 seconds, so I appreciate the answer to that. But if — so — so I didn't opt out, and so forth, and all of a sudden, I say, “You know, this just doesn't work for me, so I want to delete — ” You told Congressman Rush that you could delete. What happens to the data? I — I've already — it's fair. It's been used. It's — Cambridge Analytica may have it. So what happens when I say, “Facebook, take my data off your platform”?

ZUCKERBERG: If you delete your account, we immediately make it so that your account is — is no longer available, once you're — once you're done deleting it. So no one can find you on the service. We wouldn't be able to re-create your account from that.

We do have data centers and systems that are redundant, and we have backups in case something bad happens. And, over a number of days, we'll — we'll go through and make sure that we flush all the content out of the system.

But, as soon as you delete your account, effectively, that content is — is dismantled and we wouldn't be able to put your account back together if we wanted to.
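The deletion flow Zuckerberg describes is a common two-phase pattern: make the account unavailable immediately, then flush redundant copies and backups with a background job over a longer window. The sketch below illustrates that pattern under assumed names and an assumed 90-day purge window; it is not Facebook's actual code.

```python
import datetime

class AccountStore:
    def __init__(self):
        self.accounts = {}       # live, findable accounts
        self.pending_purge = {}  # tombstoned: hidden now, flushed later

    def delete_account(self, user_id: str, now: datetime.datetime):
        # Phase 1: immediate. The account can no longer be found or
        # re-created, even though copies still exist in backups.
        data = self.accounts.pop(user_id)
        deadline = now + datetime.timedelta(days=90)  # assumed window
        self.pending_purge[user_id] = (data, deadline)

    def purge_job(self, now: datetime.datetime):
        # Phase 2: background. Flush content from redundant systems
        # once each tombstone's deadline has passed.
        for uid, (_, deadline) in list(self.pending_purge.items()):
            if now >= deadline:
                del self.pending_purge[uid]

store = AccountStore()
store.accounts["alice"] = {"photos": ["p1.jpg"]}
t0 = datetime.datetime(2018, 4, 11)
store.delete_account("alice", t0)
assert "alice" not in store.accounts             # unavailable at once
store.purge_job(t0 + datetime.timedelta(days=91))
assert "alice" not in store.pending_purge        # backups flushed later
```

The tombstone stage is what makes "no one can find you" true immediately while the multi-day flush across redundant data centers is still running.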

WALDEN: Gentleman's time ...

GUTHRIE: Thank you. My time's expired. I appreciate it.

WALDEN: Recognize the gentleman from Maryland, Mr. Sarbanes, for four minutes.

REP. JOHN SARBANES (D-MD.): Thank you, Mr. Chairman. Good morning, Mr. Zuckerberg.

I wanted to get something in the record quickly, before I move to some questions. You had suggested in your testimony over the last couple of days that Facebook notified the Trump and Clinton campaigns of Russian attempts to hack in to those campaigns.

But representatives of both campaigns, in the last 24 hours, have said that didn't happen. So we're going to follow up on that and find out what the real story is.

ZUCKERBERG: Do you want me to ...

SARBANES: No, I'd like — I'd like to move on. You can provide a response to that in writing, if you would.

Let me ask you, is it true that Facebook offered to provide what I guess you referred to as “dedicated campaign embeds” to both of the presidential campaigns?

ZUCKERBERG: Congressman, I can quickly respond to the first point, too.

(CROSSTALK)

SARBANES: Just say yes or no, were there embeds ...

(CROSSTALK)

SARBANES: ... I need to get to that because I don't have time. Were there embeds in the two campaigns, or offers of embeds?

ZUCKERBERG: Congressman ...

SARBANES: Yes or no.

ZUCKERBERG: ... we ...

SARBANES: Were there embeds offered to the Trump campaign and the Clinton campaign?

ZUCKERBERG: We offer sales support to every campaign.

SARBANES: Okay. So sales support — I'm going to refer to that as embeds. And I gather that Mr. Trump's campaign ultimately accepted that offer. Is that correct? Yes or no.

ZUCKERBERG: Congressman, the — the Trump campaign had sales support ...

SARBANES: Okay. So they had embeds.

(CROSSTALK)

SARBANES: I'm going to refer to those as embeds.

What I'd like you to do, if you could — we're not going to have time for you to do this now — but, if you could provide to the committee both the initial offer terms, and then any subsequent offer terms that were presented to each candidate, in terms of what the embed services would be, that would be very helpful.

Do you know how many ads were approved for display on Facebook for each of the presidential candidates — by Facebook?

ZUCKERBERG: Congressman, I do not, sitting here off the top of my head.

SARBANES: Okay. Let me tell you what they were, because I do. President Trump's campaign had an estimated 5.9 million ads approved, and Secretary Clinton, 66,000 ads.

So that's a delta of about 90 times as much on the Trump campaign, which raises some questions about whether the ad approvals were maybe not processed correctly, or the approval process was inappropriately bypassed, in the final months and weeks of the election by the Trump campaign. And what I'm worried about is that the embeds may have helped to facilitate that.

Can you say with absolute certainty that Facebook or any of the Facebook employees working as campaign embeds did not grant any special approval rights to the Trump campaign to allow them to upload a very large number of Facebook ads in that final stretch?

ZUCKERBERG: Congressman, we apply the same standard to all campaigns.

SARBANES: Can you say that there were not special approval rights granted? Is that what you're saying — there were not special approval rights granted by any of the embeds — or support folks, as you call them — in that Trump campaign?

ZUCKERBERG: Congressman ...

SARBANES: Yes or no.

ZUCKERBERG: ... what I'm — yes. What I'm saying is that ...

SARBANES: Okay. All right. If you're saying yes ...

(CROSSTALK)

ZUCKERBERG: ... following the same standards.

SARBANES: ... if you're saying yes, then I'll take you at your word.

The reason this is important and the reason we need to get to the bottom of it is because it could be a serious problem if these kinds of services were provided beyond what is offered in the normal course, because that could result in violation of campaign finance law, because it would be construed as an in-kind contribution — corporate contribution from Facebook, beyond what — the sort of ad-buy opportunity you would typically provide.

The reason I'm asking you these questions is because I'm worried that that embed program has the potential to become a tool for Facebook to solicit — solicit favor from policymakers, and that, then, creates the potential for real conflict of interest.

And I think a lot of Americans are waking up to the fact that Facebook is becoming sort of a self-regulated superstructure for political discourse. And the question is, are we, the people, going to regulate our political dialogue? Or are you, Mark Zuckerberg, going to end up regulating the political discourse?

WALDEN: Gentleman's time ...

SARBANES: So we need to be free of that undue influence.

I thank you for being here ...

WALDEN: ... gentleman's time's expired.

SARBANES: ... and I yield back my time.

WALDEN: Chair recognizes the gentleman from Texas, Mr. Olson, for four minutes.

ZUCKERBERG: Mr. Chairman, do you mind, for the record, if I just answer the first point for — for ...

WALDEN: That's fine.

ZUCKERBERG: ... take 10 seconds.

WALDEN: Go ahead.

ZUCKERBERG: When I was referring to the campaigns yesterday, I meant the DNC and RNC. So I may have misspoken, and maybe, technically, that's called the committees. But that — those were the folks who I was referring to.

WALDEN: Thank you for that clarification.

We'll now go to Mr. Olson from Texas for four minutes.

REP. PETE OLSON (R-TEX.): I thank the chair. And, Mr. Zuckerberg, I know we both wish we met under a different set of circumstances.

When the story broke, you were quoted as saying, “I started Facebook. I run it. I'm responsible for what happens here,” end quote. You said those same words in your opening statement an hour and a half ago.

I know you believe that in your heart. It's not just some talking point, some canned speech, because, my four years — five — I'm sorry, nine years in the Navy, I know the best commanding officers, the best skippers, the best CEOs have that exact same attitude.

If Facebook was a Navy ship, your privacy has taken a direct hit. Your trust is severely damaged. You're taking on water and your future may be a fine with a number, per The Washington Post, with four commas in it.

Today, over $1 billion in fines coming your way. As you know, you have to reinforce your words with actions. I have a few questions about some anomalies that have happened in the past.

First of all, back in 2012, apparently, Facebook did an experiment on 689,003 Facebook users. You reduced positive posts from users' friends and limited so-called “downer” posts from other friends. They see — fed positive information to one group, and, another group, negative information.

The goal was to see how the tone of these posts would affect behavior. I look at this Forbes article, the L.A. Times, about un-legal — illegal human experimentation without permission. I want to talk about that.

It seems that this is disconnecting people, in stark contrast to your mission to connect people. Explain to us how you guys thought this idea was a good idea — experimenting with people, giving them more negative information, positive information.

ZUCKERBERG: Well, Congressman, I view our responsibility as not just building services that people like to use, but making sure that those services are also good for people and good for society overall.

At the time, there were a number of questions about whether people seeing content that was either positive or negative on social networks was affecting their mood.

And we felt like we had a responsibility to understand whether that was the case, because we don't want to have that effect, right? We don't want to have it so that — we want use of social media and our products to be good for people's well-being.

I mean, we continually make changes to — to that effect, including, just recently, this year, we did a number of research projects that showed that when social media is used for building relationships — and so when you're interacting with people, it's associated with a lot of positive effects of — of well-being that you'd expect. It — it makes you feel more connected, less lonely, it correlates with long term measures of happiness and health.

Whereas if you're using social media or the Internet just to passively consume content, then that doesn't have those same positive effects or can even be negative. So we've tried to shift the product more towards helping people interact with friends and family as a result of that. So that's the kind of — an example of the kind of work that we — that we do.

OLSON: One last question. I believe I've heard you employ 27,000 people thereabouts. Is that correct?

ZUCKERBERG: Yes.

OLSON: I've also been told that about 20,000 of those people, including contractors, do work on data security. Is that correct?

ZUCKERBERG: Yes. The 27,000 number is full-time employees. And the security and content review includes contractors, of which there are tens of thousands. Or will be. Will be by the time that we hire those.

(CROSSTALK)

OLSON: Okay, so roughly at least half your employees are dedicated to security practices. How can Cambridge Analytica happen with so much of your workforce dedicated to these — these causes? How'd that happen?

ZUCKERBERG: Well, Congressman, the — the issue with Cambridge Analytica and Aleksandr Kogan happened before we ramped those programs up dramatically. But one thing that I think is important to understand overall is just the sheer volume of content on Facebook makes it so that we can't — no amount of people that we can hire will be enough to review all of the content.

We need to rely on and build sophisticated A.I. tools that can help us flag certain content. And we're getting good in certain areas. One of the areas that I mentioned earlier was terrorist content, for example, where we now have A.I. systems that can identify and — and take down 99 percent of the al-Qaeda and ISIS-related content in our system before someone — a human even flags it to us. I think we need to do more of that.
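The proactive-review approach described here, scoring content before anyone reports it and queueing high-risk items for human review, can be sketched as follows. The keyword-overlap scorer is a stand-in for a trained machine-learning classifier, and the names and threshold are hypothetical; this shows the triage flow, not Facebook's actual systems.

```python
def risk_score(text: str) -> float:
    # Placeholder scorer: fraction of words matching a small term list.
    # Real systems use trained models, not keyword matching.
    flagged_terms = {"propaganda", "recruitment"}
    words = set(text.lower().split())
    return len(words & flagged_terms) / max(len(words), 1)

def triage(posts: list, threshold: float = 0.2):
    # Proactive flow: score every post up front; high-risk posts go to a
    # human review queue instead of waiting for a user report.
    review_queue, published = [], []
    for post in posts:
        if risk_score(post) >= threshold:
            review_queue.append(post)
        else:
            published.append(post)
    return review_queue, published

queue, ok = triage(["terror recruitment propaganda", "family picnic photos"])
assert queue == ["terror recruitment propaganda"]
assert ok == ["family picnic photos"]
```

The contrast with the report-then-review model discussed elsewhere in the hearing is that scoring happens before publication, so the reviewers' attention is spent only on the flagged fraction of the volume.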

WALDEN: Gentleman's time is expired.

Chair recognizes the gentleman from California, Mr. McNerney for four minutes.

REP. JERRY MCNERNEY (D-CALIF.): I thank the Chairman. Mr. Zuckerberg, I — I thank you for agreeing to testify before the House and Senate committees. I know it's a long, grueling process, and I appreciate your cooperation. I'm a mathematician who spent 20 years in industry and government developing technology, including algorithms. Moreover, my constituents are impacted by these issues. So I'm deeply committed and invested here.

I'm going to follow up on an earlier question. Is there currently a place that I can download all of the Facebook information about me, including the websites that I have visited?

ZUCKERBERG: Yes, Congressman. We have a “download your information” tool. We've had it for years. You can go to it in your settings and download all of the content that you have on Facebook.

MCNERNEY: Well, my staff just this morning downloaded their information and their browsing history is not in there. So are you saying that Facebook does not have browsing history?

ZUCKERBERG: Congressman, that would be correct. If — if we don't have content in there, then that means that — that you don't have it on Facebook. Or you haven't put it there.

MCNERNEY: So I'm — I'm — I'm not quite on board with this. Is there any other information that Facebook has obtained about me, whether Facebook collected it or obtained it from a third party that would not be included in the download?

ZUCKERBERG: Congressman, my understanding is that all of your information is included in your “download your information.”

MCNERNEY: Okay, I'm going to follow up with this afterwards.
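A "download your information" tool of the kind disputed in this exchange typically serializes every data category the operator chooses to include into one archive, which is precisely the point of contention: categories the operator omits, such as logged browsing activity, simply never appear in the export. The schema and names below are hypothetical.

```python
import json

def export_user_data(user_id: str, store: dict) -> str:
    # Bundle every category the service holds for this user into one
    # JSON archive. Anything not represented in `store` is invisible
    # to the user, however much of it the operator may hold elsewhere.
    archive = {category: records.get(user_id, [])
               for category, records in store.items()}
    return json.dumps({"user": user_id, "data": archive}, indent=2)

# Hypothetical backing store; note it has no "browsing_history" category,
# so no export from it could ever contain one.
store = {
    "posts": {"u1": ["Hello world"]},
    "messages": {"u1": ["See you at 5"]},
    "ad_interests": {"u1": ["travel"]},
}

archive = json.loads(export_user_data("u1", store))
assert archive["data"]["posts"] == ["Hello world"]
assert "browsing_history" not in archive["data"]
```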

Mr. Zuckerberg, you indicated that European users will have GDPR protections on May 25th, and that American users will have similar protections. When will American users have those protections?

ZUCKERBERG: Congressman, we're working on doing that as quickly as possible. I don't have the exact date yet.

MCNERNEY: So it will not be on May 25th?

ZUCKERBERG: We're working on it.

MCNERNEY: Thank you.

Your company, and many companies with an online presence, have a staggering amount of personal information. The customer is not really in the driver's seat about how their information is used or monetized; the data collectors are in the driver's seat. Today, Facebook is governed by weak federal privacy protections. I've introduced legislation that would help address this issue.

The My Data Act would give the FTC rulemaking authority to provide consumers with strong data privacy and security protections. Without this kind of legislation, how can we be sure that Facebook won't continue to be careless with users' information?

ZUCKERBERG: Well, Congressman, let me first just set aside that my position isn't that there should be no regulation.

MCNERNEY: Correct.

ZUCKERBERG: But regardless of what the laws are that are in place, we have a very strong incentive to protect people's information. This is the core thing that Facebook is, is about 100 billion times a day people come to our service to share a photo or share a message or ...

(CROSSTALK)

MCNERNEY: Well, I mean I hear — I hear — I hear you saying this, but the history isn't there. So I — I think we need to make sure that there's regulations in place to give you the proper motivation to — to stay in line with data protection. One of the problems here in my mind is that Facebook's history, the privacy — user privacy and security have not been given as high priority as corporate growth. And you've admitted as much.

Is Facebook considering changing its management structure to ensure that privacy and security have sufficient priority to prevent these problems in the future?

ZUCKERBERG: Congressman, this is an incredibly high priority for us. What I was saying before, that the core use of the product every day, about 100 billion times, is that people come and try to share something with a specific set of people. That works because people have confidence that if they send a message, it's going to go to the person that they want. If they want to share a photo with their friends, it's going to go to the people they want. That's incredibly important. We've built a — a robust privacy program. We have a chief privacy officer ...

MCNERNEY: That's a — that's a little bit off — off track from what I'm trying to get at. The privacy protections clearly failed in a couple of cases that are high profile right now. And part of the blame that — that seems to be out there is that the management structure for privacy and security don't have the right level of — of profile in — in Facebook to get your attention to make sure that they get the proper resources.

WALDEN: Gentleman's time — gentleman's time is expired.

Chair recognizes the gentleman from West Virginia, Mr. McKinley, for four minutes.

REP. DAVID B. MCKINLEY (R-W.VA.): Thank you for coming, Mr. Zuckerberg.

I've got a yes or no question, if you could give that. Should Facebook — should Facebook enable illegal online pharmacies to sell drugs such as Oxycodone, Percocet, Vicodin without a prescription?

ZUCKERBERG: Congressman, I believe ...

MCKINLEY: That's — that's a yes — yes or no. Do you think you should be able to do —

ZUCKERBERG: No, of course not.

MCKINLEY: And — there — there are 35,000 online pharmacies operating, and the FDA thinks as many as 96 percent of them may be operating illegally. In November of last year, CNBC had an article saying that you were surprised by the breadth of this opioid crisis. And, as you can see from these photographs, opioids are still available on your site, without a prescription. So that contradicts what you just said, just a minute ago.

And — and when FDA Commissioner Scott Gottlieb testified before us last week, he said that the Internet firms simply aren't taking practical steps to find and remove these illegal opioid listings. And he specifically mentioned Facebook. Are you aware of that, his quote?

ZUCKERBERG: Congressman, I'm not ...

MCKINLEY: Answer yes or no ...

(CROSSTALK)

ZUCKERBERG: ... aware of his quote, but I heard that he — that he said something. And let me just speak to this for a second ...

(CROSSTALK)

MCKINLEY: If I could — no, we don't — so, in your opening statement — and I appreciated your remark — you said, “It's not enough to give people a voice. We have to make sure that people aren't using it” — Facebook — “to hurt people.”

Now, America's in the midst of one of the worst epidemics that it's ever experienced, with this — with this drug epidemic. And it's all across this country; it's not just in West Virginia.

But your platform is still being used to circumvent the law and allow people to buy highly addictive drugs without a prescription. With all due respect, Facebook is actually enabling an illegal activity, and in so doing, you are hurting people. Would you agree with that statement?

ZUCKERBERG: Congressman, I think that there are a number of areas of content that we need to do a better job policing on our service.

Today, the primary way that content (inaudible) — regulation works here, and review, is that people can share what they want openly on the service, and then, if someone sees an issue, they can flag it to us, and then we will review it.

Over time, we're shifting to a mode where ...

(CROSSTALK)

MCKINLEY: You can — you can find out, Mr. Zuckerberg. You know which pharmacies are operating legally and illegally. But you're still continuing to take that — allow that to be posted on — on Facebook and allow people to get this — this scourge that's ravaging this country — is being enabled because of Facebook.

So my question to you, as we close, on this — you've said before you were going to take down those ads, but you didn't do it. We've got statement after statement about things — you're going to take those down within days, and they haven't gone down.

That, what I just put up, was just from yesterday. It's still up. So my question to you is, when are you going to stop — take down these posts that are done — on — with illegal digital pharmacies? When are you going to take them down?

ZUCKERBERG: Congressman, right now, when people report the posts to us, we will take them down and have people ...

MCKINLEY: Why do they have to — if you got all these 20,000 people — you know that they're up there. Where is your require — where is your accountability to allow this to be occurring — this — ravaging this country?

ZUCKERBERG: Congressman, I agree that this is a terrible issue, and, respectfully, when there are tens of billions or 100 billion pieces of content that are shared every day, even 20,000 people reviewing it can't look at everything.

What we need to do is build more A.I. tools that can proactively find that content.

MCKINLEY: If — you have been — said before you were going to take them down, and you haven't. And they're still up.

WALDEN: Gentleman's time has expired.

Chair recognizes the gentleman from Vermont, Mr. Welch, for four minutes.

REP. PETER WELCH (D-VT.): Thank you, Mr. Chairman.

Mr. Zuckerberg, you acknowledge candidly that Facebook made a mistake. You did an analysis of how it happened. You've promised action. We're at the point where the action will speak much louder than the words.

But, Mr. Chairman, this Congress has made a mistake. This event that happened, whether it was Facebook or some other platform, was foreseeable and inevitable. And we did nothing about it.

Congresswoman Blackburn and I had a — a group, a privacy working group, six meetings with many of the industry players. There was an acknowledgment on both sides that privacy was not being protected, that there was no reasonable safeguard for Americans' privacy. But there was an inability to come to a conclusion.

So we also have an obligation. And, in an effort to move forward, Mr. Zuckerberg, I've framed some questions that hopefully will allow a reasonable yes or no answer to see if there's some common ground to achieve the goal you assert you have, and we certainly have: the obligation to protect the privacy of American consumers.

First, do you believe that consumers have a right to know and control what personal data companies collect from them?

ZUCKERBERG: Yes.

WELCH: Do you believe that consumers have a right to control how and with whom their personal information is shared with third parties?

ZUCKERBERG: Congressman, yes, of course.

WELCH: And do you believe that consumers have a right to secure and responsible handling of their personal data?

ZUCKERBERG: Yes, Congressman.

WELCH: And do you believe that consumers should be able to easily place limits on the personal data that companies collect and retain?

ZUCKERBERG: Congressman, that seems like a reasonable principle to me.

WELCH: Okay. And do you believe that consumers should be able to correct or delete inaccurate personal data that companies have obtained?

ZUCKERBERG: Congressman, that one might be more interesting to debate, because ...

WELCH: Well, then, let's get — you get back to us with specifics on that. I think they do have that right.

Do you believe that consumers should be able to have their data deleted immediately from Facebook when they stop using the service?

ZUCKERBERG: Yes, Congressman, and they have that ability.

WELCH: Good.

And do you believe that the Federal Trade Commission, or another properly resourced governmental agency with rulemaking authority, should be able to determine on a regular basis what is considered personal information, to provide certainty for consumers and companies what information needs to be protected most tightly?

ZUCKERBERG: Congressman, I certainly think that that's an area where we should discuss some sort of oversight.

WELCH: There's not a big discussion here. Who gets the final say? Is it the private market companies, like yours? Or is there a governmental function here that defines what privacy is?

ZUCKERBERG: Congressman, I think that's — this is an area where some regulation makes sense. You proposed a very specific thing, and I think the details matter.

WELCH: All right. Let me ask you this. I've appreciated your testimony.

Will you work with this committee to help put us — to help the U.S. put in place our own privacy regulation that prioritizes consumers' right to privacy, just as the E.U. has done?

ZUCKERBERG: Congressman, yes, and I'll make sure that we work with — with you to flesh this out.

WELCH: All right.

And you have indicated that Facebook has not always protected the privacy of their users throughout the company's history. And it seems, though, from your answers, that consumers — you agree that consumers do have a fundamental right to privacy that empowers them to control the collection, the use, the sharing of their personal information online.

And, Mr. Chairman — and thank you. Mr. Chairman, privacy cannot be based just on company policies, whether it's Facebook or any other company. There has to be a willingness on the part of this Congress to step up and provide policy protection to the privacy rights of every American consumer.

I yield back.

WALDEN: Gentleman yields back.

Chair recognizes the gentleman from Illinois, Mr. Kinzinger, for four minutes.

REP. ADAM KINZINGER (R-ILL.): Thank you, Chairman. And, Mr. Zuckerberg, thank you for being here.

Given the global reach of Facebook, I'd like to know about the company's policies and practices with respect to information sharing with foreign governments, if you don't mind.

What personal data does Facebook make available from Facebook, Instagram, WhatsApp to Russian state agencies, including intel and security agencies?

ZUCKERBERG: Congressman, in — in general, the way we approach data and law enforcement is, if we have knowledge of imminent harm — physical harm that might happen to someone, we try to reach out to local law enforcement in order to help prevent that.

I think that that is less built out around the world. It is more built out in the U.S. So, for example, on that example, we built out specific programs in the U.S.

(CROSSTALK)

ZUCKERBERG: We have 3,000 people that are help — that are focused on making sure that, if we detect that someone is at risk of harming themselves, we can get them the appropriate ...

(CROSSTALK)

KINZINGER: What about, like — what about Russian intel agencies?

ZUCKERBERG: The — the second category of — of information is when there is a valid legal process served to us. In general, if a government puts something out that's overly broad, we're going to fight back on it. We view our duty as protecting people's information.

But, if there is valid service, especially in the U.S., we will, of course, work with law enforcement. In general, we are not in the business of providing a lot of information to the Russian government.

KINZINGER: Do you know — is this data only from accounts located in or operated from these individual countries? Or does it include Facebook's global data?

ZUCKERBERG: Sorry, can you repeat that?

KINZINGER: Yeah. Is the data only from the accounts located in or operated from those countries, in terms of Russia or anything? Or does it include Facebook's global data?

ZUCKERBERG: Well, Congressman, in general, countries do not have jurisdiction to have any valid legal reason to request data of someone outside of their country.

KINZINGER: But where is it stored? Where is the data — do they have access to data only stored in ...

ZUCKERBERG: We don't store any data in Russia.

KINZINGER: Okay, so it's the global data.

ZUCKERBERG: Yes.

KINZINGER: So let me just ask — you mentioned a few times that we're in an arms race with Russia, but is it one-sided if Facebook, as an American-based company, has given the opposition everything it needs in terms of, you know, where it's storing its data?

ZUCKERBERG: Sorry, Congressman, could you repeat that?

KINZINGER: So you mentioned a few times that we're in an arms race with Russia.

ZUCKERBERG: Yes.

KINZINGER: If you're giving Russian intelligence service agencies, potentially, even on a valid request, access to global data that's not in Russia, is that kind of a disadvantage to us and an advantage to them?

ZUCKERBERG: Congressman, let me be more precise in my testimony.

KINZINGER: Sure. Yeah, please.

ZUCKERBERG: I have no specific knowledge of any data that we've ever given to Russia. In general, we'll work with valid law enforcement requests in different countries, and we can get back to you on what that might mean with Russia, specifically. But I have no knowledge, sitting here, of any time that we would have given them information.

KINZINGER: That would be great.

Now, I've got another unique one I want to bring up. So I was just today — and I'm not saying this as a “Woe is me,” but I think this happens to a lot of people — there have been — my pictures have been stolen and used in fake accounts all around, and, in many cases, people have been extorted for money.

We report it when we can, but we're in a tail chase. In fact, today, I just Googled — or I just put on your website, “Andrew Kinzinger,” and he looks a lot like me, but it says he's from London and lives in L.A. and went to Locke High School, which isn't anything like me at all.

These accounts pop up a lot, and, again, it's using my pictures, but extorting people for money. And we hear about it from people that call and say, “Hey, I was duped,” or whatever.

Can I — I know you can't control everything. I mean, it's — you have a huge platform, and — but can you talk about, maybe, some movements into the future to try to prevent that, in terms of maybe recognizing somebody's picture and if it's fake?

ZUCKERBERG: Yes, Congressman. This is an important issue, and it's — fake accounts, overall, are a big issue, because that's how a lot of the — the other issues that we see around fake news and foreign election interference are happening, as well.

So, long-term, the solution here is to build more A.I. tools that find patterns of people using the services that no real person would do. And we've been able to do that in order to take down tens of thousands of accounts, especially related to election interference leading up to the French election, the German election and, last year, the U.S. Alabama Senate state election — Senate election — special election.

And that's an area where we should be able to extend that work and develop more A.I. tools that can do this more broadly.

KINZINGER: Okay. Thank you.

WALDEN: The gentleman's time has expired.

Chair recognizes the gentleman from New Mexico, Mr. Lujan, for four minutes.

REP. BEN RAY LUJÁN (D-N.M.): Thank you, Mr. Chairman, and I want to pick up where Mr. Kinzinger dropped off, here.

Mr. Zuckerberg, Facebook recently announced a search feature that allowed malicious actors to scrape data on virtually all of Facebook's 2 billion users.

Yes or no: In 2013, Brandon Copley, the CEO of Giftnix, demonstrated that this feature could easily be used to gather information at scale.

Well, the answer to that question is yes.

Yes or no: This issue of scraping data was again raised in 2015 by a cyber security researcher, correct?

ZUCKERBERG: Congressman, I'm not specifically familiar with that. The feature that we identified — I think it was a few weeks ago, or a couple weeks ago, at this point — was a search feature that allowed people to look up some information that people had publicly shared on their profiles.

LUJAN: Well ...

ZUCKERBERG: So names, profile pictures, public information.

LUJAN: If I may, Mr. Zuckerberg, I will recognize that Facebook did turn this feature off. My question, and the reason I'm asking about 2013 and 2015, is Facebook knew about this in 2013 and 2015, but you didn't turn the feature off until Wednesday of last week — the same feature that Mr. Kinzinger just talked about, where this is essentially a tool for these malicious actors to go and steal someone's identity and put the finishing touches on it.

So, again, you know, one of your mentors, Roger McNamee, recently said your business is based on trust, and you are losing trust. This is a trust question. Why did it take so long, especially when we're talking about some of the other pieces that we need to get to the bottom of?

Your failure to act on this issue has made billions of people potentially vulnerable to identity theft and other types of harmful, malicious actors.

So, on to another subject, Facebook has detailed profiles on people who have never signed up for Facebook. Yes or no?

ZUCKERBERG: Congressman, in general, we collect data of people who have not signed up for Facebook for security purposes, to prevent the kind of scraping that you were just referring to.

LUJAN: So these are called shadow profiles? Is that what they've been referred to by some?

ZUCKERBERG: Congressman, I'm not — I'm not familiar with that ...

(CROSSTALK)

LUJAN: I'll refer — I'll refer to them as shadow profiles for today's hearing. On average, how many data points does Facebook have on each Facebook user?

ZUCKERBERG: I do not know off the top of my head.

LUJAN: So the average for non-Facebook platforms is 1,500. It's been reported that Facebook has as many as 29,000 data points for an average Facebook user.

You know how many points of data that Facebook has on the average non-Facebook-user?

ZUCKERBERG: Congressman, I do not off the top of my head, but I can have our team get back to you afterwards.

LUJAN: I appreciate that.

It's been admitted by Facebook that you do collect data points on non-users. So my question is, can someone who does not have a Facebook account opt out of Facebook's involuntary data collection?

ZUCKERBERG: Congressman, anyone can turn off and opt out of any data collection for ads, whether they use our services or not.

But, in order to prevent people from scraping public information, which — again, the search feature you brought up only showed public information — people's names and profiles and things that they had made public. But, nonetheless, we don't want people aggregating even public information.

LUJAN: But — so ...

(CROSSTALK)

ZUCKERBERG: ... block that, so we need to know when someone is trying to repeatedly access our services ...

LUJAN: If I may, Mr. Zuckerberg, I'm about out of time.

It may surprise you that we have not talked about this a lot today. You said everyone controls their data, but you're collecting data on people that are not even Facebook users, that have never signed a consent, a privacy agreement — and you're collecting their data.

And it may surprise you that, on Facebook's page, when you go to “I don't have a Facebook account and would like to request all my personal data stored by Facebook,” it takes you to a form that says, “Go to your Facebook page, and then, on your account settings, you can download your data.”

So you're directing people who don't have access — don't even have a Facebook page to have to sign up for a page to reach their data. We've got to fix that.

The last question that I have is have you disclosed to this committee or to anyone all the information Facebook has uncovered about Russian interference on your platform?

ZUCKERBERG: Congressman, we're working with the right authorities on that, and I'm happy to answer specific questions here, as well.

WALDEN: The gentleman's time is expired.

LUJAN: Thank you, Mr. Chair.

WALDEN: The chair now recognizes the gentleman from Virginia, Mr. Griffith, for four minutes.

REP. H. MORGAN GRIFFITH (R-VA.): Thank you very much, Mr. Chairman. I appreciate — appreciate you being here.

Let me state up front that I share the privacy concerns that you've heard from a lot of us, and I appreciate your statements and willingness to, you know, help us figure out a solution that's good for the American people. So I appreciate that.

Secondly, I have to say that it's my understanding that, yesterday, Senator Shelley Moore Capito, my friend in my neighboring state of West Virginia, asked you about Facebook's plans with rural broadband, and you agreed to share that information with her at some point in time, get her up to date and up to speed.

I was excited to hear that you were excited about that and passionate about it. My district is very similar to West Virginia, as it borders it and we have a lot of rural areas. Can you also agree, yes or no, to update me on that when the information is available?

ZUCKERBERG: Yes, Congressman. We will certainly follow up with you on this.

Part of the mission of connecting everyone around the world means that everyone needs to be able to be on the Internet. And, unfortunately, too much of the Internet infrastructure today is too expensive for the current business models of carriers to support a lot of rural communities with the quality of service that they deserve.

So we are building a number of specific technologies, from planes that can beam down Internet access, to repeaters and mesh networks to make it so that — that all these communities can be served. And we'd be happy to follow up with you on this to ...

(CROSSTALK)

GRIFFITH: I appreciate that. And we've got a lot of drone activity going on in our district, whether it's University of Virginia in Wise, or Virginia Tech. So we'd be happy to help out there, too.

Let me — let me switch gears. You talked about trying to ferret out misinformation. And the question becomes, who decides what is misinformation?

So, when the — some of my political opponents put on Facebook that, you know, they think Morgan Griffith is a bum, I think that's misinformation. What say you?

(LAUGHTER)

ZUCKERBERG: Congressman, without weighing in on that specific piece of content, let me outline the way that we approach fighting fake news in general.

There are three categories of fake news that we fight. One are basically spammers. They're economic actors, like — like the Macedonian trolls that I think we have all heard about — basically, folks who do not have an ideological goal. They're just trying to write the most sensational thing they can, in order to get people to click on it so they can make money on ads. It's all economics.

So the way to fight that is we make it so they can't run our ads, they can't make money. We make it so we can detect what they're doing and show it less in news feeds, so they can make less money. When they stop making money, they just go and do something else, because they're economically inclined.

The second category are basically state actors, right, so what we've found with Russian interference. And those people are setting up fake accounts. So, for that, we need to build A.I. systems that can go and identify a number of their fake account networks.

And, just last week, we traced back the Russian activity to — to specific — a fake account network that Russia had in Russia to influence Russian culture and other Russian-speaking countries around them.

And we took down a number of their fake accounts and pages, including a news organization that was sanctioned by Russian — by the Russian government as a Russian state news organization. So that's a pretty big action. But removing fake accounts is the other way that we can fake — stop the spread of false information.

GRIFFITH: And I appreciate that. My time is running out.

I do want to point this out, though, as part of that: You know, who is going to decide what is misinformation? We've heard about the Catholic University and the cross. We've heard about a candidate. We've heard about the conservative ladies; a firearms shop, lawful, in my district had a similar problem. It has also been corrected.

And so I wonder if the industry has thought about — not only are we looking at it, but has the industry thought about doing something like Underwriters Laboratories, which was set up when electricity was new to determine whether or not the devices were safe?

Have you all thought about doing something like that, so it's not Facebook alone, but the industry, saying, “Wait a minute, this is probably misinformation,” and setting up guidelines that everybody can agree are fair?

ZUCKERBERG: Yes, Congressman. That's actually the third category that I was going to get to next, after economic spammers and state actors with fake accounts.

One of the things we're doing is working with a number of third parties who — so, if people flag things as — as false news or — or incorrect, we run them by third-party fact checkers, who are all accredited by the Poynter Institute. There are ...

WALDEN: Gentleman's time ...

ZUCKERBERG: ... firms of all — of all leanings around this, who do this work, and that's — that's an important part of the effort.

WALDEN: Gentleman's time is expired.

GRIFFITH: I yield back.

WALDEN: Chair now recognizes the gentleman from New York, Mr. Tonko, for four minutes.

REP. PAUL TONKO (D-N.Y.): Thank you.

Mr. Zuckerberg, I want to follow up on a question asked by Mr. McNerney, where he talked about visiting websites and the fact that Facebook can track you, and, as you visit those websites, you can have that deleted.

I'm informed that there's not a way to do that. Or are you telling us that you are announcing a new policy?

ZUCKERBERG: Congressman, my understanding is that, if there's — if we have information from you visiting other places, then you have a way of getting access to that and deleting it and making sure that we don't store it anymore.

In the specific question that the — the other congressman asked, I think it's possible that we just didn't have the information that he was asking about in the first place, and that's why it wasn't there.

TONKO: Well, 3 billion user accounts were breached at Yahoo in 2013, 145 million at eBay in 2014, 143 million at Equifax in 2017, 78 million at Anthem in 2015, 76 million at JPMorgan Chase in 2014 — the list goes on and on.

The security of all that private data is gone, likely sold many times over to the highest bidder on the dark web. We live in an information age. Data breaches and privacy hacks are not a question of if. They are a question of when.

But the case with Facebook is slightly different. The 87 million accounts extracted by Cambridge Analytica are just the beginning, with, likely, dozens of other third parties that have accessed this information. As far as we know, the dam is still broken.

As you have noted, Mr. Zuckerberg, Facebook's business model is based on capitalizing on the private personal information of your users. Data security should be a central pillar of this model.

And, with your latest vast breach of privacy and the widespread political manipulation that followed it, the question that this committee must ask itself is what role the federal government should play in protecting the American people and the democratic institutions that your platform, and others like it, have put at risk.

In this case you gave permission to mine the data of some 87 million users, based on the deceptive consent — consent of just a fraction of that number. When they found out I was going to be speaking with you today, my constituents asked me to share some of their concerns in person.

How can they protect themselves on your platform? Why should they trust you again with their likes, their loves, their lives? Users trusted Facebook to prioritize user privacy and data security, and that trust has been shattered.

I'm encouraged that Facebook is committed to making changes, but I am indeed wary that you are only acting now out of concern for your brand and only making changes that should have been made a long time ago.

We have described this as an arms race, but, every time we saw what precautions you have or, in most cases, have not taken, your company is caught unprepared and ready to issue another apology. I'm left wondering again why Congress should trust you again. We'll be watching you closely to ensure that Facebook follows through on these commitments.

Many of my constituents have asked about your business model, where users are the product. Mary of Half Moon, in my district, called it infuriating. Andy of Schenectady, New York, asked, “Why doesn't Facebook pay its users for their incredibly valuable data?”

Facebook claims that users rightly own and control their data, yet their data keeps being exposed on your platform, and these breaches cause more and more harm each time.

You have said that Facebook was built to empower its users. Instead, users are having their information abused with absolutely no recourse. In light of this harm, what liability should Facebook have? When users' data is mishandled, who is responsible and what recourse do users have? Do you bear that liability?

ZUCKERBERG: Congressman, I think we're responsible for protecting people's information, for sure. But one thing that you said that I — that I want to provide some clarity on ...

TONKO: Do you bear the liability?

ZUCKERBERG: Well, you said earlier — you referenced that you thought that we were only taking action after this came to light. Actually, we made significant changes to the platform in 2014 that would have made this incident with Cambridge Analytica impossible to happen again today.

I wish we'd made those changes a couple of years earlier, because this poll app got people to use it back in 2013 and 2014. And, if we had made the changes a couple of years earlier, then we would have — then we ...

(CROSSTALK)

WALDEN: Gentleman's time has expired. Chair recognizes ...

TONKO: Mr. Chairman, if I might ask that other questions that my constituents have be answered by unanimous consent.

WALDEN: Sure. Without objection, of course. That's — that goes for all members.

Chair recognizes the gentleman from Florida, Mr. Bilirakis, for four minutes.

REP. GUS BILIRAKIS (R-FLA.): Thank you. Thank you, Mr. Chairman — appreciate it. And thanks for your testimony, Mr. Zuckerberg.

Well, first of all, I wanted to follow up with Mr. — Mr. McKinley's testimony. This is bad stuff, Mr. Zuckerberg, with regard to the illegal online pharmacies.

When are the — those ads — I mean, when are you going to take those off? I think we need an answer to that. I think they need to get off — we need to get these off as soon as possible.

Can you give us an answer, a clear answer as to when these pharmacies — we have an epidemic here with regard to the opioids. I think we're owed a clear answer, a definitive answer as to when these ads will be off — offline.

ZUCKERBERG: Congressman, if people flag those ads for us, we will take them down now.

BILIRAKIS: Now?

ZUCKERBERG: Yes.

BILIRAKIS: By the end of the day?

ZUCKERBERG: If people flag them for us, we will look at them as quickly as we can ...

(CROSSTALK)

BILIRAKIS: Well, you have knowledge now, obviously. You have knowledge — you have knowledge of those ads. Will you begin to take them out — down today?

ZUCKERBERG: The ads that are flagged for us, we will review and take down, if they violate our policies, which I believe the ones ...

(CROSSTALK)

BILIRAKIS: They clearly do. I — if they're illegal, they clearly violate your laws.

(CROSSTALK)

ZUCKERBERG: ... but — but what I think really needs to happen here is not just us reviewing content that gets flagged for us. We need to be able to build tools that can proactively go out and identify what might be these — these ads for — for opioids, before people even have to flag them for us to review.

BILIRAKIS: I agree.

ZUCKERBERG: And that's — that's going to be a longer term thing, in order to build that solution. So — but, today, if someone flags the ads for us, we will take them down.

BILIRAKIS: Work on those tools as soon as possible, please.

Okay. Next question. A constituent of mine in District 12 of Florida, the Tampa Bay area, came to me recently with what was clear — a clear violation of your privacy policy. In this case, a third-party organization publicly posted personal information about my constituent on his Facebook page.

This included his home address, voting record, degrading photos and other information. In my opinion, this is cyber bullying. For weeks, my constituent tried reaching out to Facebook on multiple occasions through its report feature, but the offending content remained. It was only when my office got involved that the posts were removed almost immediately for violating Facebook policy.

BILIRAKIS: How does Facebook's self-reporting policy work to prevent misuse? And why did it take an act of Congress — a member of Congress to get, again, a clear privacy violation removed from Facebook? If you can answer that question, I'd appreciate it, please.

ZUCKERBERG: Congressman, that clearly sounds like a big issue and something that would violate our policies. I don't have specific knowledge of that case, but what I imagine happened, given what you just said, is that they reported it to us and one of the people who reviews content probably made an enforcement error.

And then, when you reached out, we probably looked at it again and realized that it — that it violated the policies, and took it down. We have a number of steps that we need to take to improve the accuracy of our enforcement.

BILIRAKIS: Absolutely.

ZUCKERBERG: That's — that's a big issue. And we have to check content faster ...

BILIRAKIS: It has to be consistent.

ZUCKERBERG: ... and we need to — to be able to do better at this. I think the same solution to the opioid question that you raised earlier, of doing more with automated tools, will lead to both faster response times, and more accurate enforcement of the policies.

BILIRAKIS: Can you give us a timeline as to when will this be done? I mean, this is very critical for — I mean, listen, my family uses Facebook, my friends, my constituents. We all use Facebook. I use Facebook. It's wonderful ...

WALDEN: Gentleman's time ...

BILIRAKIS: ... for us seniors to connect with our relatives.

WALDEN: ... gentleman's time has expired.

BILIRAKIS: Yeah, I'm sorry. Can I submit for the record my additional questions?

WALDEN: Yes, sir.

BILIRAKIS: Thank you. Thank you so much ...

(CROSSTALK)

WALDEN: Without objection.

The chair recognizes the gentlelady from New York, Ms. Clarke, for four minutes.

REP. YVETTE D. CLARKE (D-N.Y.): I thank you, Mr. Chairman. And thank you for coming before us, Mr. Zuckerman (sic).

Today, I want to take the opportunity to represent the concerns of the newly formed Tech Accountability Caucus, in which I serve as a co-chair with my colleagues, Representative Robin Kelly, Congressman Emanuel Cleaver and Congresswoman Bonnie Watson Coleman, but, most importantly, people in our country and around the globe who are in vulnerable populations, including those who look just like me.

My first question to you is, as you may be aware, there have been numerous media reports about how more than 3,000 Russian ads were bought on Facebook to incite racial and religious division and chaos in the U.S. during the 2016 election.

Those ads specifically characterized and weaponized African American groups like Black Lives Matter, in which ads suggested, through propaganda — or fake news, as people call it these days — that they were a rising threat.

Do you think that the lack of diverse, culturally competent personnel in your C-suite and throughout your organization — such that your company did not detect, disrupt and investigate these claims — is a problem in this regard?

ZUCKERBERG: Congresswoman, I agree that we need to work on diversity. In this specific case, I don't think that that was the issue, because we were, frankly, slow to identifying the whole Russian misinformation operation, and not just that specific example.

Going forward, we're going to address this by verifying the identity of every single advertiser who's running political or issue-oriented ads, to make it so that foreign actors or people trying to spoof their identity or say that they're someone that they're not cannot run political ads or run large pages of the type you're talking about.

CLARKE: So, were they — whether they were Russian or not, when you have propaganda, how are you addressing that? Because this was extremely harmful during the last election cycle and it — and can continue to be so in the — in the upcoming elections and throughout the year, right?

I'm concerned that there are not eyes that are culturally competent looking at these things and being able to see how this would impact on civil society. If everyone within the organization is monolithic, then you can miss these things very easily.

And we've talked about diversity forever, with your organization. What can you say today, when you look at how all of this operates, that you can do immediately to make sure that we have the types of viewing or reviewing that could enable us to catch this in its tracks?

ZUCKERBERG: Congresswoman, we announced a change in how we're going to review ads and big pages so that, now, going forward, we're going to verify the identity and location of every advertiser who's running political or issue ads or — and the identities ...

(CROSSTALK)

CLARKE: Good. We — we'd like you to get back to us with a timeline on that. This is ...

(CROSSTALK)

ZUCKERBERG: That will be in place for these elections.

CLARKE: Okay. Fabulous.

When Mr. Kogan sold the Facebook-based data that he acquired through the quiz app to Cambridge Analytica, did he violate Facebook's policies at the time?

ZUCKERBERG: Yes, Congresswoman.

CLARKE: When the Obama campaign collected millions of Facebook users' data through their own app during the 2012 election, did it violate Facebook's policies at the time?

ZUCKERBERG: No, Congresswoman, it did not.

CLARKE: I hope you understand that this distinction provides little comfort to those of us concerned about our privacy online. Regardless of political party, Americans desperately need to be protected. Democrats on this committee ...

WALDEN: Gentlelady's time ...

CLARKE: ... have been calling for strong privacy and data security legislation for years. We really can't wait.

Mr. Chairman, I yield back. Thank you, Mr. Zuckerberg.

WALDEN: Gentlelady's time has expired.

Chair recognizes the gentleman from Ohio, Mr. Johnson, for four minutes.

REP. BILL JOHNSON (R-OHIO): Thank you, Mr. Chairman. Mr. Zuckerberg, thanks for joining us today.

Let me add my list — my name to the list of folks that you're going to get back to on the rural broadband Internet access question. Please add my name to that list.

ZUCKERBERG: Of course.

JOHNSON: I got a lot of those folks in my district.

You know, you're a — you're a real American success story. There's no question that you and Facebook have revolutionized the way Americans — in fact, the world — communicate and interconnect with one another.

I think the reason that — one of the reasons that you were able to do that is because nowhere other than here in America, where a young man in college can pursue his dreams and ambitions on his own terms without a big federal government overregulating them and telling them what they can and cannot do, could you have achieved something like this.

But, in the absence of — of federal regulations that would reel that in, the only way it works for the betterment of society and people is with a high degree of responsibility and trust. And you've acknowledged that there have been some breakdowns in responsibility.

And I think, sometimes — and I'm a technology guy. I have two degrees in computer science. I'm a software engineer. I'm a patent holder. So I know the challenges that you face in terms of managing the technology.

But, oftentimes, technology folks spend so much time thinking about what they can do, and little time thinking about what they should do. And so I want to talk about some of those “should do” kind of things.

You heard earlier about faith-based material that had been — that had been taken down, ads that had been taken down. You admitted that it was a mistake. That was in my district, by the way — Franciscan University, a faith-based university, was the one that did that.

JOHNSON: How is your content filtered and determined to be appropriate, or not appropriate, and policy-compliant? Is it an algorithm that does it? Or is there a team of a gazillion people that sit there and look at each and every ad, that make that determination?

ZUCKERBERG: Congressman, it's a combination of both. So, at the end of the day, we have — we have community standards that are written out, and try to be very clear about what's — what is acceptable.

And we have a large team of people. As I said, by the end of this year, we're going to have about 20,000 — more than 20,000 people working on security and content review across the company.

But, in order to flag some content quickly, we also build technical systems in order to take things down. So, if we see terrorist content, for example, we'll flag that, and we can — we can take that down.

JOHNSON: What do — what do you do when you — when you find someone or something that's made a mistake? I mean, I've heard you say several times today that you know a mistake has been made. What — what kind of accountability is there when mistakes are made?

Because, every time a mistake like that is made, it's a little bit of a chip away from the trust and the responsibility factors. How do you hold people accountable in Facebook, when they make those kind of mistakes of taking stuff down that shouldn't be taken down, or leaving stuff up that should not be left up?

ZUCKERBERG: Congressman, for content reviewers specifically, their performance is going to be measured by whether they do their job accurately, and ...

JOHNSON: Do you ever fire anybody when they do stuff like that?

ZUCKERBERG: I — I'm — I'm sure we do. As is part of the normal course of — of running a company, you — you're hiring and firing people all the time to grow your capacity, and — and to ...

(CROSSTALK)

JOHNSON: What happened to the — what happened to the person that took down the Franciscan University ad and didn't put it back up until the media started getting involved?

ZUCKERBERG: Congressman, I'm not specifically aware of that case.

JOHNSON: Could you take that question for me? My time is expired. Can you take that question for me and — and get me that answer back, please?

ZUCKERBERG: We will.

JOHNSON: Okay, thank you very much. I yield back.

WALDEN: The gentleman's time's expired.

The chair recognizes the gentleman from Iowa, Mr. Loebsack.

REP. DAVID LOEBSACK (D-IOWA): Thank you, Mr. Chairman. I want to thank you and the ranking member for holding this hearing today, and I want to thank Mr. Zuckerberg for being here today, as well.

Add my name to the rural broadband list, as well. I have one-fourth of Iowa, the southeast part of Iowa. We definitely need more help on that front. Thank you.

You may recall, last year, Mr. Zuckerberg, that you set out to visit every state in the country, to meet different people, and one of those places you visited was, in fact, Iowa — my home state of Iowa. And you did visit the district that I proudly represent, and you met some of my constituents.

As you began your tour, you said that you believed in connecting the world and giving everyone a voice, and that, quote — you wanted, quote, “to personally hear more of those voices.”

I'm going to do the same thing in just a second that a number of my colleagues did, and just ask you some questions that were submitted to my Facebook page by some of my constituents.

I do want to say at the outset, though — and I do ask for unanimous consent to enter all those questions on the record, Mr. Chair ...

WALDEN: Without objection.

LOEBSACK: ... I think trust — that has been the issue today. There's no question about it. I think that's what — what I'm hearing from my constituents. That's what we're hearing from our colleagues.

That's really the question: How can we be guaranteed that, for example, when you agree to some things today, that you're going to follow through, and that we're going to be able to hold you accountable?

And — and without, perhaps, constructing too many rules and regulations — we'd like to keep that to a minimum if we possibly can. But I do understand that you have agreed that we're going to have to have some rules and regulations so that we can protect people's privacy, so that we can protect that use of the consumer data.

So, going forward from there, I've just got a — a few questions I'll probably have an opportunity to get to. The first one goes to the business model issue, because you're publicly traded. Is that correct?

ZUCKERBERG: Yes.

LOEBSACK: And you're the CEO.

ZUCKERBERG: Yes.

LOEBSACK: Right.

And so I've got Lauren from Solon who asks, “Is it possible for Facebook to exist without collecting and selling our data?” Is it possible to exist?

ZUCKERBERG: Congressman, we don't sell people's data. So I think that that's an important thing to clarify up front. And then, in terms of collecting data, I mean, the whole purpose of the service is that you can share the things that you want with the people around you, right, or — and your friends. So ...

LOEBSACK: Is it — is it possible for you to be in business without sharing the data? Because that's what you have done, whether it was selling or not — sharing the data, providing it to Cambridge Analytica and other folks along the way.

Is it possible for your business to exist without doing that?

ZUCKERBERG: Well, Congressman, it would be possible for our business to exist without having a developer platform. It would not be possible for our business to — or — or our products or our services or anything that we do to exist without having the opportunity for people to go to Facebook, put in the content that they want to share and who they want to share it with, and then go do that. That's the core thing that ...

(CROSSTALK)

LOEBSACK: Okay, thank you. I — I appreciate that.

And then Brenda from Muscatine — she has a question, obviously, related to trust, as well, and that is, how will changes promised this time be proven to be completed? She'd like to know. How's that going to happen?

If there are changes — you said there have been some changes — how can she and those folks in our districts, and throughout America — not just members of Congress, but how can folks in our districts hold you accountable? How do they know that those changes are, in fact, going to happen? That's what that question's about.

ZUCKERBERG: Congressman, for the developer platform changes that we announced, they're implemented. We're putting those into place. We announced a bunch of specific things. It's on our — our blog, and I wrote it in my written testimony, and that stuff is happening.

We're also going back and investigating every single app that had access to a large amount of data before we locked down the platform in the past. We will tell people if we find anything that misused their data, and we will tell people when the investigation is complete.

LOEBSACK: Thank you.

And, finally, Chad from Scott County wants to know, “Who has my data, other than Cambridge Analytica?”

ZUCKERBERG: Congressman, part of what I just said is that we're going to do an investigation of every single app that had access to a large amount of people's data. If you — if you signed into another app, then that probably has access to some of your data.

And part of the investigation that we're going to do is — is to determine whether those app developers did anything improper, or shared that data further, beyond that. And, if we find anything like that, we will tell people that their — that their data was misused.

WALDEN: The gentleman's time is expired.

(CROSSTALK)

LOEBSACK: ... thank you, Mr. Chair.

WALDEN: Chair recognizes the gentleman from Missouri, Mr. Long, for four minutes.

REP. BILLY LONG (R-MO.): Thank you, Mr. Chairman, and thank you, Mr. Zuckerberg, for being here today on a voluntary basis. I want to put that out here — you were not subpoenaed to be here, as Mr. Barton offered up a little bit ago.

We've had — you're the only witness at the table today. We've had 10 people at that table, to give you an idea of what kind of hearings we've had in here. Not too long ago, we had 10, and I'd say that, if we invited everyone that had read your terms of agreement — terms of service, we could probably fit them at that table.

I also would say that I had — represent 751,000 people, and, out of that 751,000 people, the people in my area that are really worked up about this — Facebook, and about this hearing today — would also fit with you there at the table.

So I'm not getting the outcry from my constituents about what's going on with Cambridge Analytica and — and this user agreement and everything else. But there are some things that I think you need to be concerned about. One question I'd like to ask before I move into my questioning is what was FaceMash, and is it still up and running?

ZUCKERBERG: No, Congressman. FaceMash was a — a prank website that I launched in college, in my dorm room, before I started Facebook. There was a movie about this — or it said it was about this. It was of unclear truth. And the — the claim that FaceMash was somehow connected to the development of Facebook — it isn't. It wasn't.

(CROSSTALK)

LONG: It's coincidental. The timing was the same, right? Just coincidental.

ZUCKERBERG: It was in 2003.

(CROSSTALK)

ZUCKERBERG: ... took it down, and it actually has nothing to do with Facebook.

LONG: You put up pictures of two women, and decide which one was the better — more attractive of the two, is that right?

ZUCKERBERG: Congressman, that is an accurate description of the prank website that I made when I was a sophomore in college.

LONG: Okay. Okay, I just — but, from that beginning — whether it was actually the beginning of Facebook or not — you've come a long way. Jan Schakowsky — Congresswoman Schakowsky, this morning, said self-regulation simply does not work.

Mr. Butterfield, Representative Butterfield, said that you need more African American inclusion on your board of directors. If I was you — a little bit of advice — Congress is good at two things: doing nothing, and overreacting.

So far, we've done nothing on Facebook. Since your inception in that Harvard dorm room, many years ago, we've done nothing on Facebook. We're getting ready to overreact. So take that as just a shot across the bow, warning to you.

You've got a good outfit there, on your front row, behind you, that — they're very bright folks. You're Harvard-educated. I have a Yale hat that cost me $160,000 — that's as close as I ever got to an Ivy League school.

But I'd like to show you, right now, a — a little picture here. You recognize these folks?

ZUCKERBERG: I do.

LONG: Who are they?

ZUCKERBERG: I — I believe — is that Diamond and Silk?

LONG: That is Diamond and Silk, two biological sisters from North Carolina. I might point out they're African American. And their content was deemed by your folks to be unsafe.

So, you know, I don't know what type of picture this is — if it was taken in a police station, or what, in a lineup — but apparently they've been deemed unsafe. Diamond and Silk have a question for you, and that question is, what is unsafe about two black women supporting President Donald J. Trump?

ZUCKERBERG: Well, Congressman, nothing is unsafe about that. The specifics of — of this situation, I — I'm not as up to speed on as — as I probably would be ...

(CROSSTALK)

LONG: ... you have 20,000 employees, as you said, to check content. And I would suggest, as good as you are with analytics, that those 20,000 people use some analytical research and see how many conservative websites have been pulled down, and how many liberal websites.

One of our talk show hosts at home — Nick Reed — this morning, on the radio, said that, if Diamond and Silk were liberal, they'd be on the late-night talk show circuit, back and forth. They're humorous, they have their opinion, not that you have to agree or that I have to agree — to agree — don't agree — with them.

But the fact that they're conservative — and I would just remember — if you don't remember anything else from this hearing here today, remember we do nothing and we overreact.

WALDEN: Gentleman's time ...

LONG: And we're getting ready to overreact. So I would suggest you go home and review all these other things people have accused you of today, get with your good team — they're behind you ...

WALDEN: ... gentleman's time's expired.

LONG: ... you're the guy to fix this. We're not. You need to save your ship. Thank you.

WALDEN: Gentleman's time has expired.

REP. JAN SCHAKOWSKY (D-ILL.): Mr. Chairman, since my name was mentioned, can I just respond?

WALDEN: Well, I — I'd tell you, I'd — if we could move on, just because we're going to run out of time for members down the dais to be able to ask their questions ...

SCHAKOWSKY: Okay, I'm going to — I consider Billy Long a good friend. Let me just say that I don't think it was a breach of decorum, and I just take issue with his saying that a very modest bill that I've introduced is an overreach. That's all.

LONG: I didn't say it was an overreach. All I said was that — I was just letting — reminding with several ...

(CROSSTALK)

WALDEN: I now recognize the gentleman from Oregon, Mr. Schrader, for questions for four minutes.

REP. KURT SCHRADER (D-ORE.): Thank you, Mr. Chairman, I appreciate that. Mr. Zuckerberg, again, thank you for being here — appreciate your — your good offices and voluntarily coming before us.

You have testified that you voluntarily took Cambridge Analytica's word that they had deleted information, found out subsequently that they did not delete that information, have sent in your own forensics team, which I — I applaud.

I just want to make sure — get some questions answered here. Can you tell us that they were not told — they were told not to destroy any data — misappropriated data they may find?

ZUCKERBERG: Congressman, so you're right that, in 2015, when we found out that the app developer, Aleksandr Kogan, had sold data to Cambridge Analytica, we reached out to them. At that point, we demanded that they delete all the data that they had.

They told us, at that point, that they had done that. And then, a month ago, we heard a new report that said that they actually hadn't done that.

SCHRADER: But I'm talking about the direction you've given your forensic team. Now, if they find stuff, they are not to delete it at this point in time? Or are they going to go ahead and delete it?

ZUCKERBERG: The audit team that we are sending in?

SCHRADER: Right.

ZUCKERBERG: The first order of business is to understand exactly what happened. And ...

(CROSSTALK)

SCHRADER: I'm worried about the — the information being deleted without law enforcement having an opportunity to actually review that.

Will you commit to this committee that neither Facebook nor its agents have removed any information or evidence from Cambridge Analytica's offices?

ZUCKERBERG: Congressman, I do not believe that we have. And ...

(CROSSTALK)

SCHRADER: How about Mr. Kogan's office, if I may ask?

ZUCKERBERG: ... one specific point on this is that our audit in the — of Cambridge Analytica — we have paused that in order to cede to the U.K. government, which is conducting its own government audit, which, of course — an investigation which, of course ...

(CROSSTALK)

SCHRADER: Yes, where I'm — with all due respect, what I'm getting at is I'd like to have the information available for the U.K. or U.S. law enforcement officials, and I did not hear you commit to that.

Will you commit to the committee that Facebook has not destroyed any data or records that may be relevant to any federal, state or international law enforcement investigation?

ZUCKERBERG: Congressman, yes. What I'm saying is that the U.K. government is going to complete its investigation before we go in and do our audit. So they will have full access to all the information.

SCHRADER: You suspended your audit, pending the U.K.'s investigation?

ZUCKERBERG: Yes, we've — we've — we've paused it, pending theirs.

SCHRADER: So it's my understanding that you and other Facebook executives have the ability to rescind or delete messages that are on people's websites.

To be clear, I just want to make sure that, if that is indeed the case — that, after you've deleted that information — that, somehow, law enforcement — particularly relevant to this case — would still have access to those messages.

ZUCKERBERG: Congressman, yes. We have a document retention policy at the company where, for some people, we delete emails after a period of time, but we, of course, preserve anything that there's a legal hold on.

SCHRADER: Great. Well, I appreciate that.

While you've testified very clearly that you do not sell information — it's not Facebook's model; you do the advertising and obviously have other means of revenue — but it's pretty clear others do sell that information.

Doesn't that make you somewhat complicit in what they're doing, your allowing them to sell the information that they glean from your website?

ZUCKERBERG: Well, Congressman, I would disagree that we allow it. We actually expressly prohibit any developer that people ...

(CROSSTALK)

SCHRADER: How do you — how do you enforce that? That's my concern. How do you enforce that? Complaint only is what I've heard so far tonight.

ZUCKERBERG: Yes, Congressman. Some of it is — is in response to reports that we get, and some of it is we do spot checks to make sure that the apps are actually doing what they — what they say they're doing. And, going forward, we're going to increase the number of audits that we do, as well.

SCHRADER: So last question is it's my understanding, based on the testimony here today, that, even after I'm off of Facebook — that you guys still have the ability to follow my web interactions. Is that correct?

ZUCKERBERG: Congressman ...

SCHRADER: I've logged out of Facebook. Do you still have the ability to follow my interactions on the web?

ZUCKERBERG: ... Congressman, you have control over what we do for — for ads and the information collection around that. On security, there may be specific things about how you use Facebook, even if you're not logged in, that we — that we keep track of, to make sure that people aren't abusing the systems.

WALDEN: Gentleman's time has expired.

And, just for our — our members who haven't had a chance to ask questions, we will pause at 1:30 — well, we will have votes at 1:40. We will continue the hearing after a — a brief pause, and we'll — we'll coordinate that.

We'll go now to Dr. Bucshon.

REP. LARRY BUCSHON (R-IND.): Thank you, Mr. Chairman. Thank you, Mr. Zuckerberg, for being here.

There are plenty of anecdotal examples, including from family members of mine, where people will be verbally discussing items, never having actively been on the Internet at the time, and then, the next time they get on Facebook or other online apps, ads for things that they were verbally discussing with each other will show up.

And I know you said in the Senate that Facebook doesn't listen — specifically listen to what people are saying through their — through their phone, whether that's a Google phone or whether it's Apple or another one.

However, the other day, my mother-in-law and I were discussing her brother, who had been deceased for about 10 years, and, later on that evening, on — on her Facebook site, she had a — she had, set to music, kind of an in memoriam picture collage that came up on Facebook, specifically to her brother. And that happened the other night.

So, if you don't — you're not listening to us on the phone, who is? And do you have specific contracts with — with these companies that will provide data that you — is being acquired verbally through our — through our phones or, now, through things like Alexa or other — other products?

ZUCKERBERG: Congressman, we're not collecting any information verbally on the microphone, and we don't have contracts with anyone else who is.

The only time that we might use the microphone is when you're recording a video or doing something where you intentionally are trying to record audio. But we don't have anything that is trying to listen to what's going on in the background.

BUCSHON: Okay, because, I mean — like I said, I mean, you've talked to people that this has happened to. My son who lives in Chicago was — him and his colleagues were talking about a certain type of suit, because they're business guys, and, the next day, he had a bunch of ads for different suits on — on that, when he went onto the Internet.

So it's pretty obvious to me that someone is — is listening to the audio on — on our phones, and that — I see that as a pretty big issue. And the reason is — is because — and you may not be, but I see this as a pretty big issue for — because, for example, if you're in your doctor's office, if you're in your corporate boardroom, your office or even personal areas of your home, that's potentially an issue.

And I'm glad to hear that Facebook isn't listening, but — but I'm skeptical that someone isn't. And I — I see this as an industry-wide issue that you could potentially help address.

And the final thing I'll just ask is that, when you have, say, an executive session or whatever, your corporate board, and you have decisions to be made, do you allow the people in the room to have their phones on them?

ZUCKERBERG: Congressman, we do. I don't think we have a policy that says that your phone can't be on. And, again, I'm not that — I'm not familiar with — Facebook doesn't do this, and I'm not familiar with other companies that — that do, either.

My understanding is that a lot of these cases that you're talking about are a coincidence, or someone is — might be talking about something, but then they also go to a website or interact with it on Facebook, because they were talking about it, and then maybe they'll see the ad because of that, which is a much clearer statement of the — the intent.

BUCSHON: Okay. Because, if — if that's the case, then — I mean, I know, for convenience, companies have developed things like Alexa, and I don't want to — and other companies are developing things like that.

But it just seems to me that the whole — part of the whole point of those products is not just for your own convenience, but, when you're verbally talking about things and then you're not on the Internet, they're able to collect information on the type of activities that — that you're engaging in.

So I'd — I'd implore the industry to — to look into that and make sure that, in addition to physical — exploring the Internet and collecting data, that data being ...

(CROSSTALK)

BUCSHON: ... taken verbally not be allowed. Thank you.

WALDEN: The gentleman's time is expired.

Chair recognizes the gentleman from Massachusetts, Mr. Kennedy, for four minutes.

REP. JOSEPH KENNEDY III (D-MASS.): Thank you, Mr. Chairman. Mr. Zuckerberg, thank you for being here. Thank you for your patience and — over both days of testimony.

You spoke about the framing of your testimony about privacy, security, and democracy. I want to ask you about privacy and democracy, because I think, obviously, those are linked.

You have said over the course of questioning yesterday and today that users own all of their data. So I want to make sure that we drill down on that a little bit, but I think our colleagues have tried.

That includes, I believe, that the Facebook — that — the information that Facebook requires users to make public — so that would be a profile picture, gender, age range — all of which is public-facing information. That's right?

ZUCKERBERG: Yes.

KENNEDY: Okay. So can advertisers, then — understanding that you, Facebook, maintain the data; you're not selling that to anybody else — but advertisers clearly end up having access through that — through agreements with you about how they, then, target ads to me, to you, to any other user.

Can advertisers in any way use nonpublic data — so data that individuals would not think is necessarily public — so that they can target their ads?

ZUCKERBERG: Congressman, the way this works is — let's say you have a business that is selling skis, okay, and you have on your profile that you are interested in skiing. But let's say you haven't made that public, but you share it with your — with your friends, all right?

So, broadly, we don't tell the advertiser that — “Here's a list of people who like skis.” They just say, “Okay, we're trying to sell skis. Can you reach people who like skis?” And then we match that up on our side, without sharing any of that information with the advertisers.

KENNEDY: Understood. They don't — you don't share that, but they get access to that information so that — if they know — they want to market skis to me, because I like skis.

On the realm of data that is accessible to them, does that include — does Facebook include deleted data?

ZUCKERBERG: Congressman, no. And I — I also would push back on the idea that we're giving them access to the data. We allow them to reach people who have said that on Facebook, but we're not giving them access to data.

KENNEDY: Fair, fair.

So can advertisers, either directly or indirectly, get access to or use the metadata that Facebook collects in order to more specifically target ads?

So that would include — I know you've talked a lot about how Facebook would use access to information for folks that — well, I might be able to opt in or out about your ability to track me to other websites. Is that used by those advertisers, as well?

ZUCKERBERG: Congressman, I'm not sure I understand the question. Can you — can you give me an example of what you mean?

KENNEDY: So does — essentially, does — the advertisers that are using your platform — do they get access to information that the user doesn't actually think is either, one, being generated, or, two, is public?

Understanding that, yes, if you dive into the details of your — your platform, users might be able to shut that off, but I think one of the challenges with trust here is that there's an awful lot of information that's generated, that people don't think that they're generating, and that advertisers are being able to target because Facebook collects it.

ZUCKERBERG: Yes.

So, Congressman, my understanding is that the targeting options that are — that are available for advertisers are generally things that are based on what people share.

Now, once an advertiser chooses how they want to target something, Facebook also does its own work to help rank and determine which ads are going to be interesting to which people.

ZUCKERBERG: So we may use metadata or other behaviors of what you've shown that you're interested in on news feed or other places in order to make our systems more relevant to you. But that's a little bit different from giving that as an option to an advertiser, if that makes sense.

KENNEDY: Right. But, then, I guess, the question back to — and I've only got 20 seconds. I think one of the rubs that you're hearing is I don't understand how users, then, own that data. I think that's part of the rub.

Second, you focus a lot of your testimony and the questions on the individual privacy aspects of this. But we haven't talked about the societal implication of it.

And I think, while I applaud some of the reforms that you're putting forward, the underlying issue here is that your platform has become a — a ...

WALDEN: Gentleman's time ...

KENNEDY: ... mix of — two seconds — news, entertainment, social media that is up for manipulation. We've seen that with a foreign actor. If the changes to individual privacy don't seem to be sufficient to address that underlying issue ...

WALDEN: Gentleman's time has expired.

KENNEDY: ... I'd love your comments on that at the appropriate time. Thank you.

WALDEN: Chair recognizes the gentleman from Texas, Mr. Flores, for four minutes.

REP. BILL FLORES (R-TEX.): Thank you, Mr. Chairman. Mr. Zuckerberg, thank you for being here today. I'm up here, top row. I'm certain there are other things you'd rather be doing.

The activities of Facebook and other technology companies should not surprise us. I mean, we've seen it before — and again, don't take this critically. But we saw a large oil company become a monopoly back in the late 1800s, early 1900s. We saw a large telecommunications company become a near-monopoly in the '60s, '70s and '80s.

And, just as Facebook — and these companies were founded by bright entrepreneurs. Their companies grew. And, eventually, they sometimes became detached from everyday Americans. And what happened is policymakers then had to step in and reestablish the balance between those — those folks and everyday Americans.

You didn't intend for this to happen. It did happen, and I appreciate that you've apologized for it. And one of the things I appreciate about Facebook — it appears you're proactively trying to address the situation.

Just as we addressed those monopolies in the past, we're faced with that similar — that situation today. We need to — and this — this goes beyond Facebook. This has to do with the edge providers. It has to do with social media organizations and also with ISPs.

Back to — to Facebook in particular, though, we heard examples yesterday, during the Senate hearing, and also today, during this hearing, so far, about ideological bias among the users of Facebook.

In my Texas district, I have a retired schoolteacher whose conservative postings were banned or stopped. The good news is I was able to work with Facebook's personnel and get her reinstated. That said, the Facebook censors still seem to be trying to stop her postings. And I — anything you can do in that regard to fix that bias will go a long way.

I want to move a different direction; that's to talk about the future. Congress needs to consider policy responses, as I said earlier. And I want to call this policy response Privacy 2.0 and Fairness 2.0.

With respect to fairness, I think the technology companies should be ideologically agnostic regarding their users' public-facing activities. The only exception would be for potentially violent behavior.

I'll ask — my — my question is, on this, do you agree that Facebook and other technology platforms should be ideologically neutral?

ZUCKERBERG: Congressman, I — I agree that we should be a platform for all ideas, and that we should focus on that.

FLORES: Good.

ZUCKERBERG: I ...

FLORES: I've got to — I've got limited time.

With respect to privacy, I think that we need to set a baseline. When we talk about a virtual person that each technology user establishes online — their name, address, their online purchases, geolocation data, websites visited, pictures, et cetera — I think that the individual owns the virtual person they set up online.

My second question is this. You've said earlier that each user owns their virtual presence. Do you think that this concept should apply to all technology providers, including social media platforms, edge providers and ISPs?

ZUCKERBERG: Congressman, yes. In general, I mean, I think that people own their ...

(CROSSTALK)

FLORES: Thank you. I'm not trying to cut you off. You can provide more information supplementally, after, if you don't mind.

In this regard, I believe that Congress enact — if Congress enacts privacy standards for technology providers, just as we have for financial institutions, health care, employee benefits, et cetera, the policy should state that the data of technology users should be held privately unless they specifically consent to the use of the data by others.

This release should be based on the absolute transparency as to what data will be used, how it will be processed, where — how — where it will be stored, what algorithms will be applied to it, who will have access to it, if it will be sold and to whom it might be sold.

The disclosure of this information and the associated opt-in actions should be easy to understand and easier for nontechnical users to execute. The days of the long-scrolling fine-print disclosures with a single check mark at the bottom should end.

In this regard, based on my use of ...

WALDEN: Gentleman's ...

FLORES: ... Facebook, I think you've come a long way toward meeting that objective. I think we must move further. I'll have two questions to submit later.

And thank you — if you can expand on your responses to my earlier questions later, thank you.

WALDEN: Gentleman's time has expired.

Chair recognizes the gentleman from California for four minutes, Mr. Cardenas.

REP. TONY CÁRDENAS (D-CALIF.): Thank you very much. Seems like we've been here forever, don't you think?

Well, thank you, Mr. Chairman, Ranking Member, for holding this important hearing. I'm of the opinion that, basically, we're hearing from one of the leaders — the CEO of one of the biggest corporations in the world — but yet almost entirely in an environment that is unregulated, or, in basic terms, the lanes in which you're supposed to operate are very wide and broad, unlike other industries.

Yet, at the same time, I have a chart here of the growth of Facebook. Congratulations to you and your shareholders. It shows that, in 2009, your net value of the company was less than — or revenue was less than a billion dollars. And then you look all the way over to 2016 — it was in excess of $26 billion.

And then, in 2017, apparently, you're about close to $40 billion. Are those numbers relatively accurate about the growth and the phenomenon of Facebook?

ZUCKERBERG: Congressman, these sound relatively accurate.

CARDENAS: Okay.

It — just so you know, just brought to my attention — my staff texted me a little while ago that the CEO of Cambridge Analytica apparently stepped down, some time today. I don't know if anybody on your team there whispered that to you, but my staff just reported that.

That's interesting. The fact that the CEO of Cambridge Analytica stepped down — does that in and of itself solve the issue and the controversy around what they did?

ZUCKERBERG: Congressman, I don't think so. There are — there are a couple of big issues here. One is what happened specifically with Cambridge Analytica — how were they able to buy data from a developer that people chose to share it with? And how do we make sure that that can't happen again?

CARDENAS: But some of that information did originate with Facebook, correct?

ZUCKERBERG: People had it on Facebook, and then chose to share theirs and some of their friends' information with this developer, yes.

CARDENAS: Something was brought to my attention most recently that apparently Facebook does, in fact, actually buy information to add or augment the information that you have on some of your users, to build, around them, their profile.

ZUCKERBERG: Congressman, we just recently announced that we were stopping working with data brokers as part of the ad system. It's ...

CARDENAS: But you did do that to build your company, in the past?

ZUCKERBERG: It's — it's an industry standard ad practice, and, recently, upon examining all of our systems, we decided that's not a thing that we want to be a part of, even if everyone else is doing it.

CARDENAS: But you did engage in that, as well — not just everybody else, but Facebook yourselves — you did engage in that?

ZUCKERBERG: Yes, until we announced that we're shutting it down. Yes.

CARDENAS: Okay. It's my understanding that, when The Guardian decided to report on the Cambridge Analytica consumer data issue, Facebook threatened to sue them if they went forward with their — their story. It appears — did it happen something like that? Facebook kind of warned them, like, “Hey, maybe you don't want to do that”?

ZUCKERBERG: Congressman, I don't believe that. I think that there may have been a specific factual inaccuracy that we ...

CARDENAS: So, in other words, you checking The Guardian and saying, “You're not going to want to go out with that story because it's not 100 percent factual” — that's ...

(CROSSTALK)

ZUCKERBERG: ... that specific point, yes.

CARDENAS: Okay. Now — but, however, they did go through with their story, regardless of the warnings or the threats of Facebook saying that “You don't — not going to want to do that.”

When they did — did do that — and only then did Facebook actually apologize for that incident, for that 89 million users' information, unfortunately, ending up in their hands. Isn't that the case?

ZUCKERBERG: Congressman, you're right that we apologized after they posted the story. They had the — most of the details of what was — of what was right there.

CARDENAS: Okay.

ZUCKERBERG: And I don't think we objected to that.

CARDENAS: Thank you.

ZUCKERBERG: There was a specific thing ...

(CROSSTALK)

CARDENAS: Okay. But I only have a few more seconds.

My — my main point is this: I think it's time that you, Facebook — if you want to truly be a leader in all the sense of the word and recognize that you can, in fact, do right by American users of Facebook when it comes to information, unfortunately, getting into the wrong hands — you can be a leader.

Are you committed to actually being a leader in that sense?

WALDEN: Chairman — the gentleman's time.

CARDENAS: Can you give a two second answer?

WALDEN: Sure.

ZUCKERBERG: Congressman, I'm — I am definitely committed to taking a broader view of our responsibility. That's what my testimony is about, making sure that we don't just give people tools, but make sure that they're used for good.

CARDENAS: Thank you very much. Thank you, Mr. Chairman.

WALDEN: And, with that, we will recess for about five minutes, 10 minutes. We'll recess for 10 minutes and then resume the hearing.

(RECESS)

WALDEN: All right, we're going to reconvene the Energy and Commerce Committee, and we will go next to the gentlelady from Indiana, Ms. Brooks, for four minutes to resume questioning.

REP. SUSAN BROOKS (R-IND.): Thank you, Mr. Chairman, and thank you, Mr. Zuckerberg, for being here today. It's so critically important that we hear from you and your company because we do believe that it is critically important for you to be a leader in these solutions.

One thing that has been talked about just very little, but I think is very important — and I want to make sure there is appropriate attention on it — is how the platform of Facebook, but even other platforms — and you've mentioned it a little bit — how you help us in this country keep our country safe from terrorists. And so it's a — I talked with lots of people who actually continue to remain very concerned about recruitment of their younger family members, and now we're seeing around the globe an enhanced recruitment of women as well to join terrorist organizations. And so I'm very, very concerned. I'm a former U.S. attorney.

And so when 9/11 happened, you didn't exist. Facebook did not exist, but since the evolution, after 9/11, we know that al-Shabab, al-Qaeda and ISIS have used social media like we could not even imagine.

So can you please talk about — and then you talked about the fact that if there is content that is objectionable or is a danger that people report it to you, but what if they don't? What if everybody assumes that someone is reporting something to you? So I need you to help assure us as well as the American people, what is Facebook's role, leadership role, in helping us fight terrorism and help us stop the recruitment, because it is still a grave danger around the world?

ZUCKERBERG: Congresswoman, thanks for the question. Terrorist content and propaganda has no place in our network, and we have developed a number of tools that have now made it so that 99 percent of the ISIS and al-Qaeda content that we take down is identified by our systems and taken down before anyone even flags it for us.

So that's an example of removing harmful content that we're proud of, and I think is a model for other types of harmful content as well.

BROOKS: Can I ask though — and I appreciate, and I heard you say 99 percent — and yet I didn't go out and, you know, look for this, but yet, as recently as March 29th ISIS content was discovered on Facebook, which included an execution video, March 29th.

On April 9th there were five pages located, on April 9th, of Hezbollah content, and so forth.

And so, what is the mechanism that you're using? Is it artificial intelligence? Is it the 20,000 people? What are you using to — because it's not — I appreciate that no system is perfect, but yet this is just within a week.

ZUCKERBERG: Congressman, it's a good question, and it's a combination of technology and people. We have a counterterrorism team at Facebook.

BROOKS: How large is it?

ZUCKERBERG: Two hundred people are just focused on counterterrorism, and there are other content reviewers who are reviewing content that gets flagged to them as well. So those are folks who are working specifically on that. I think we have capacity in 30 languages that we're working on.

In addition to that we have a number of A.I. tools that we're developing, like the ones that I mentioned that can proactively go flag the content.

BROOKS: And so you might have those people looking for the content. How are they helping block the recruiting?

ZUCKERBERG: Yes so there's ...

BROOKS: Is it still — your platform as well as Twitter and then WhatsApp is how they then begin to communicate which I understand you own. Is that correct?

ZUCKERBERG: Yes.

BROOKS: So how are we stopping the recruiting and the communications?

ZUCKERBERG: So we identify what might be the patterns of communication or messaging that they might put out and then design systems that can proactively identify that and flag those for our teams. That way we can go and take those down.

BROOKS: Thank you. My time is up. I thank you and please continue to work with us and all the governments who are trying to fight terrorism around the world.

ZUCKERBERG: Thank you. We will.

And, Mr. Chairman, if you don't mind before we go to the next question, there was something I wanted to correct in my testimony from earlier, when I went back and talked to my team afterwards.

WALDEN: Sure.

ZUCKERBERG: I'd said that — this was in response to a question about whether web logs that — that we had about a person would be in “download your information.” I had said that they were. And I clarified with my team that, in fact, the web logs are not in “download your information.” We only store them temporarily, and we convert the web logs into a set of ad interests that you might be interested in, and we put that in “download your information” instead, and you have complete control over that. So I just wanted to clarify that one for the record.

WALDEN: I appreciate that. Thank you.

We go now to the gentleman from California, Mr. Ruiz.

REP. RAUL RUIZ (D-CALIF.): Thank you, Mr. Chairman, and thank you, Mr. Zuckerberg, for appearing before the committee today.

The fact is, Mr. Zuckerberg, Facebook failed its customers. You said as much yourself. You've apologized, and we appreciate that.

We as Congress have a responsibility to figure out what went wrong here and what could be done differently to better protect consumers' private digital data in the future.

So my first question for you, Mr. Zuckerberg, is why did Facebook not notify the FTC in 2015 when you first discovered this had happened, and was it the legal opinion of your current company that you are under no obligation to notify the FTC, even with the 2011 consent order in place?

ZUCKERBERG: Congressman, in retrospect, it was a mistake, and we should have — and I wish we had — notified and told people about it.

RUIZ: Did you think that ...

ZUCKERBERG: The reason why we didn't ...

RUIZ: ... the rules were kind of lax, that you were sort of debating whether you needed to or something?

ZUCKERBERG: Yes, Congressman, I don't believe that — that we necessarily had a legal obligation to do so. I just think it was probably ...

RUIZ: Okay.

ZUCKERBERG: ... I think that it was the right thing to have done. The reason why we didn't do it at the time ...

RUIZ: Well — well — well, you answered my question. Would you agree that for Facebook to continue to be successful, it needs to continue to have the trust of its users?

ZUCKERBERG: Absolutely.

RUIZ: Great. So does this not, perhaps, strike you as a weakness with the current system; that you are not required to notify the FTC of a potential violation of your own consent decree with them, and that you did not have clear guidelines for what you as a company needed to do in this situation to maintain the public's trust, and act in their best interest?

ZUCKERBERG: Congressman, regardless of what the laws or regulations are that are in place, we take a broader view of our responsibilities around privacy, and I think that we should have notified people, because it would have been the right thing to do, and I've committed ...

(CROSSTALK)

RUIZ: I'm just trying to think of the other CEO who might not have such a broad view, and might interpret the different legal requirements, maybe, differently. So that's why I'm asking these questions. I'm — I'm — I'm also taking a broad view as a Congressman here, to try to fix this problem.

So from what we've learned over the past two days of hearings, it just doesn't seem like the FTC has the necessary tools to do what needs to be done to protect consumer data and consumer privacy, and we can't exclusively rely on companies to self-regulate in the best interest of consumers. So Mr. Zuckerberg, would — would it be helpful if there was an entity clearly tasked with overseeing how consumer data is being collected, shared and used, and which could offer guidelines, at least guidelines for companies like yours to ensure your business practices are not in violation of the law, something like a digital consumer protection agency?

ZUCKERBERG: Congressman, I think it's an idea that deserves a lot of consideration. I think — I — I'm not the type of person who thinks that there should be no regulation, especially because the Internet is getting to be so important in people's lives around the world. But I think the details on this really matter, and whether it's an agency, or a law that is passed, or the FTC has certain abilities, I — I think that is all something that we should be ...

RUIZ: Well, one of the things that we're realizing is that there's a lot of holes in the system; that — that, you know, we don't have the toolbox, you don't have the toolbox to monitor 9 million apps, and tens of thousands of — of data collectors, and there's no specific mechanism for you to collaborate with those that can help you prevent these things from happening. And so I think that — that perhaps if we — if we started having these discussions about what would have been helpful for you to build your toolbox, and for us to build our toolbox, so that we can prevent things like Cambridge Analytica, things like identity thefts, things like what, you know, what we're seeing — what we've heard about today.

So thank — you know, I just want to thank you for your thoughts and testimony. So it's clear to me that this is the beginning of many, many conversations on the topic, and I look forward to working with you and the committee to — to better protect consumer privacy.

ZUCKERBERG: Congressman, we look forward to following up, too.

RUIZ: Thank you.

WALDEN: We now go to the gentleman from Oklahoma, Mr. Mullin, for four minutes.

REP. MARKWAYNE MULLIN (R-OKLA.): Thank you, Mr. Chairman, and sir, thank you for being here. I appreciate you using the term “Congressman” and “Congresswoman.” My name's Markwayne Mullin, and feel free to use that name.

Sir, I — I just want to tell you, first of all, I want to commend you on your ability to not just invent something, but to see it through its — through its growth. We see a lot of inventors have the ability to do that, but to manage it, and to see that — see it through its tremendous growth period takes a lot of talent, and you can show — by your showing here today, you — you handle yourself well, so — so thank you on that. And you also do that by hiring the right people, so I commend you on doing that, also. You hire people, obviously, based on their ability to get the job done.

Real quick, a couple questions I have, and I'll give you time to answer it. Isn't it the consumers' responsibility, to some degree, to control the content which they release?

ZUCKERBERG: Congressman, I believe that people should have the ability to choose to share their data how they want, and they need to understand how that's working. But I — I agree with what you're saying, that people want to have the ability to move their data to another app, and we want to give them the tools to — to do that.

MULLIN: Right. And — and does the device settings, does it really help you protect what information is released? For instance, there's been a lot of talk about them searching for something, maybe on Google, and then the advertisement pops up on Facebook. Isn't there a setting on most devices to where you can close out the browser without Facebook interacting with that?

ZUCKERBERG: Yes, Congressman. On — on most devices, the way the operating system is architected would prevent something that you do in another app like Google from being visible to — to the Facebook app.

MULLIN: See, I — I come from the — from the background of believing that everything I do, I assume is open for anybody to take when I'm on the Internet. I — I understand that it is — it is privacy concerns, but you're still releasing it to something farther than a pen and pad. So once I'm — once I'm on the Web, or I'm on an app, then that information is subject to — to going, really, anyplace. All I can do is protect it the best I can by my settings.

And so what I'm trying to get to is, as a — as an individual, as a user of Facebook, how can someone control keeping the content within the realm that they want to keep it, without it being collected? You say that, you know, you don't sell it. However, you do — you do sell advertisement. As a business owner, I have a demographic that I go after, and I search advertisers that — that market to that demographic. So you collect information for that purpose, right?

ZUCKERBERG: Congressman, yes, we — we collect information to make sure that the ad experience on Facebook can be relevant and valuable to small businesses ...

MULLIN: Sure.

ZUCKERBERG: ... and — and others who want to reach people.

MULLIN: Value-based. But if I don't — If I'm a customer or a user of Facebook, and I don't want that information to be shared, how do I keep that from happening? Is there settings within the app that I need to go to to set — to block all that?

ZUCKERBERG: Congressman, yes, there is. There is a setting, so if you don't want any data to be collected around advertising, you can — you can turn that off, and then we won't do it.

In general, we offer a lot of settings over every type of information that you might want to share on Facebook, in every way that you might interact with the system, from here's the content that you put on your page, to here is who can see your interests, to here's how you might show up in — in search results if people look for you, to here's how the — how you might be able to sign into developer apps, and login with Facebook, and — and advertising. And we — we try to make the controls as easy to understand as possible. You know, it's a — it's a broad service. People use it for a lot of things, so there are a number of controls, but we try to make it as easy as possible, and — and to put those controls in front of people so that they can configure the experience in a way that they want.

MULLIN: Would that have kept apps from seeking our information, if that's ...

WALDEN: The gentleman's time.

MULLIN: Thank you. I appreciate it. Thank you, Chairman.

ZUCKERBERG: Thank you.

WALDEN: Recognize now the gentleman from California for four minutes.

PETERS: Thank you, Mr. Chairman. Thank you, Mr. Zuckerberg, for being with us today, and I — you know, it's been a long day.

I want to — I — I think we can all agree that technology has outpaced the law, with respect to the protection of private information. I wonder if you think it would be reasonable for Congress to define the legal duty of privacy that's owed by private companies to their customers, with respect to their personal information.

ZUCKERBERG: Congressman, I think that that makes sense to discuss, and I agree with the broader point that I think you're making, which is that the Internet and technology overall is just becoming a much more important part of all of our lives.

The — the companies in the technology industry are — are growing ...

PETERS: Right, that's what I mean by it's outpaced, and I — I wonder — I would also want to take you at your word; I believe you're sincere that you personally place a high value on consumer privacy, and that — that personal commitment is significant at Facebook today coming from you, given your position. But I also observe, and you'd agree, that the performance on privacy has been inconsistent.

I wonder, you know, myself whether that's because it's not a bottom line issue. It — it — it appears that the shareholders are interested in — in maximizing profits, privacy neither — certainly doesn't drive profits I don't think, but also may interfere with profits if you have to sacrifice your ad revenues because of privacy concerns.

Would it not be appropriate for — for us once we define this — this duty to assess financial penalties in a way that would sufficiently send a signal to the shareholders and to your employees — who you must be frustrated with too — that the privacy you're so concerned about is a bottom line issue at Facebook?

ZUCKERBERG: Congressman, it's certainly something that we can consider, although one thing that I would push back on is I think it is often characterized as maybe these mistakes happen because there's some conflict between what people want and business interests. I actually don't think that's the case. I think a lot of these hard decisions come down to different interests between different people.

So for example, on the one hand people want the ability to sign into apps and bring some of their information and bring some of their friends' information in order to have a social experience. And on the other hand, everyone wants their information locked down and completely private. And the question is — it's not a business question as much as which of those equities do you weigh more?

PETERS: I think part of it is that, but — but part of it also what happened with Cambridge Analytica, some of this data got away from us, and I'd suggest to you that if — if there were financial consequences to that that made a difference to the business, not people dropping their Facebook accounts, they would get more attention.

And it's not so much a — a business model choice — I congratulate you on your business model — but it's that these issues aren't getting the — the bottom line attention that — that I think would have given — made them a priority with respect of Facebook.

Let me just follow up in my final time on a — on an exchange you had with Senator Graham yesterday about regulation, and — and I — I think the Senator said, do you as a company welcome regulation, and you said, if it's the right regulation, then yes. Question, do you think that the Europeans have it right? And you said, I think they get some things right. I wanted you to elaborate on what the Europeans got right, and what do you think they got wrong?

ZUCKERBERG: Congressman, well there are — there are a lot of things that the — that the Europeans do, and — and I think that — I think that GDPR in general is — is going to be a very positive step for the Internet, and a lot of the things it codifies are things that we've done for a long time. Some of them are things that — that I think would be — would be good steps for us to take. So for example, the controls that — that this requires are generally controls — privacy controls — that we've offered around the world for years.

Putting the tools in front of people repeatedly, not just having them in settings, but putting them in front of people and getting — and making sure that people understand what the controls are and that they get affirmative consent, I think it's a good thing to do that we've done periodically in the past, but I think it makes sense to do more, and I think that's something the GDPR will — will require us to do and — and will be positive.

PETERS: Anything you think they got wrong?

ZUCKERBERG: I would — I need to think about that more.

PETERS: Well I would appreciate it if you could respond in writing. I really — again, really appreciate you being here.

Thank you Mr. Chairman.

WALDEN: Thank you. We'll go now to the gentleman from North Carolina, Mr. Hudson, for four minutes.

HUDSON: Thank you. Thank you, Mr. Zuckerberg, for being here. This is a long day. You're here voluntarily, and we sure appreciate you — you being here.

I can say from my own experience, I've hosted two events with Facebook in my district in North Carolina working with small business and finding ways they can increase their customer base on Facebook, and it's been very beneficial to us, so I thank you for that.

I do want to spin this slightly and frame the discussion in another light for my question. One of the greatest honors I have is to represent the men and women of Fort Bragg, epicenter of the universe, home of the airborne and special operations, which you visited last year.

ZUCKERBERG: I did.

HUDSON: Very well received, so you understand that, due to the sensitive nature of some of the operations these soldiers conduct, many are discouraged or even prohibited from having a social media presence.

However, there are others who — who still have profiles, or some who may have deleted their profiles upon entering military service. Many have family members who have Facebook profiles. And as we've learned, each one of these users' information may have been shared without their consent.

There's no way that Facebook can guarantee the safety of this information on another company's server once they sell this information. If private information can be gathered by apps without explicit consent of the user, they're almost asking to be hacked.

Are you aware of the national security concerns that would come from allowing those who seek to harm our nation access to information such as the geographical location of members of our Armed Services? Is this something that you're — you're looking at?

ZUCKERBERG: Congressman, I'm not — I'm not specifically aware of — of that threat, but in general, there are a number of national security and election integrity-type issues that we focus on, and we try to take a very broad view of that. And the more input that we can get from the intelligence community as well, encouraging us to — to look into specific things, the more effectively we can do that work.

HUDSON: Great, well I'd love to follow up with you on that. It's been said many times here that you refer to Facebook as a platform of all ideas — or a platform for all ideas. I know you've heard from many yesterday and today about concerns regarding Facebook censorship of content, particularly content that may promote Christian beliefs or conservative political beliefs.

I have to bring up Diamond & Silk again because they're actually from my district, but — but I think you've addressed these concerns, but I think it's also become very apparent, and I hope it's become very apparent to you, that this is a very serious concern.

I actually asked on my Facebook page for my constituents to give me ideas of things they'd like for me to ask you today, and the most common question was about personal privacy.

So this is something — I — I think there are issues, in terms of trust with consumers, that your company needs to deal with. I think you recognize that based on your testimony today.

But my question to you is, what is the standard that Facebook uses to determine what is offensive or controversial, and how has that standard been applied across Facebook's platform?

ZUCKERBERG: Congressman, this is an important question. So there are a couple of standards. The strongest one is things that will cause physical harm, or threats of physical harm, but then there is a broader standard of — of hate speech and speech that might make people feel just broadly uncomfortable or unsafe in the community.

HUDSON: That's probably the most difficult to define, so I guess my question is how do you apply — what standards do you apply to try to determine what's hate speech versus what's just speech you may disagree with?

ZUCKERBERG: Congressman, that's a very important question, and I think is — is one that we struggle with continuously, and the question of, what is hate speech versus what is legitimate political speech is, I — I think, something that we get criticized both from the left and the right on what the definitions are that we have.

It's — it is — it's nuanced, and what we try to — we try to lay this out in our community standards, which are public documents, that we can make sure that you and your — your office get to look through the definitions on this, but this is an area where I think society's sensibilities are also shifting quickly, and it's also very different and ...

(CROSSTALK)

HUDSON: I'm just running out of time here. I hate to cut you off. But let me just say that, you know, based on the statistics Mr. Scalise shared and the anecdotes we can provide you, it seems like there's still a challenge when it comes to conservative (inaudible), and I hope you will address that.

(CROSSTALK)

ZUCKERBERG: I agree.

HUDSON: With that, Mr. Chairman, I'll stop talking.

WALDEN: Gentleman's time's expired. We now go to the gentleman from New York, Mr. Collins for four minutes.

COLLINS: Thank you, Mr. Chairman. And I wasn't sure where I would be going with this, but when you're number 48 out of 54 members you know you can do a lot of listening, and I've tried to do that today. And to — to frame where I am now, I think, first of all, thank you for coming. And there's a saying, you don't know what you know until you know it. And I really think you've done a — a great benefit to Facebook and yourself in particular as we now have heard, without a doubt, Facebook doesn't sell data. I think the narrative would be, of course you sell data. And now we all know across America you don't sell data. I think that's very good for you, a very good clarification.

The other one is that the whole situation we're here for is because a third-party app developer, Aleksandr Kogan, didn't follow through on the rules. He was told he couldn't sell the data. He gathered the data, and then he did what he wasn't supposed to do and he sold that data. And it's very hard to anticipate a bad actor doing what they're doing until after they've done it, and clearly you took actions after 2014.

So one real quick question, in, you know, 10 or 20 or 30 seconds: what did change? What data was being collected before you locked down the platform, and how did that change to today?

ZUCKERBERG: Congressman, thank you.

So, before 2014 when we announced the change, a — someone could sign into an app and share some of their data, but also could share some basic information about their friends. And in 2014 the major change was we said, now you're not going to be able to share any information about your friends. So if you and your friend both happen to be playing a game together or on an app that — listening to music together, then that app could have some information from both of you because you both had signed in and authorized that app. But other than that, people wouldn't be able to share information from their friends.

So the basic issue here — where 300,000 people used this quiz app, the developer ultimately sold the data to Cambridge Analytica, and Cambridge Analytica had access to as many as 87 million people's information — wouldn't be possible today. Today, if 300,000 people used an app, the app might have information about 300,000 people.

COLLINS: And — and I think that's a very good clarification as well, because people were wondering how 300,000 becomes 87 million. So that — that's also something that's good to know. And, you know, I guess in my last minute, as I've heard the tone here, I've got to give you all the credit in the world. We would say "the other side" sometimes when we point to our left, but when the representative from Illinois, to quote her, said, "Who is going to protect us from Facebook?" I mean, that threw me back in my chair. That was certainly aggressive — we'll — we'll use the polite word "aggressive" — but I think an out-of-bounds kind of comment. Just my opinion.

And I've said I was interviewed by a couple of folks in the break and I said, you know, as I'm listening to you today I'm quite confident that you truly are doing good. You believe in what you're doing. 2.2 billion people are using your platform. And I sincerely know in my heart that you do believe in — in keeping all ideas equal, and you may vote a certain way or not but that doesn't matter.

You've got 27,000 employees and I think the fact is that you're operating under a Federal Trade Commission consent decree from 2011. That's a real thing, and it goes for 20 years. So when someone said, do we need more regulations, or do we need more legislation? I said no.

Right now what we have is Facebook with a CEO whose mind is in the right place, doing the best you can with 27,000 people, but the consent decree does what it does. I mean, there would be significant financial penalties were Facebook to ignore that consent decree.

So as I'm hearing this meeting go back and forth, I for one think it was beneficial. It's good. I don't think we need more regulations and legislation now, and I want to congratulate you, I think, on doing a good job here today in presenting your case, and we now know what we didn't know beforehand. So thank you again.

ZUCKERBERG: Thank you.

WALDEN: Okay. Now I think we go next in order to Mr. Walberg actually, who was here when the gavel dropped. So we will go to Mr. Walberg for four minutes.

WALBERG: Well, thank you, Mr. Chairman. I appreciate that. And I — Mr. Zuckerberg, I appreciate you being here as well. It has been interesting to listen to all of the comments from both sides of the aisle, and to get an idea of the breadth, length, depth and vastness of our World Wide Web, social media and, more specifically, Facebook.

I want to ask three starter questions. I don't think they'll take long answers, but I'll let you — let you answer. Earlier you indicated that there were bad actors, and that triggered your platform policy changes in 2014, but you didn't identify who those bad actors were. Who were they?

ZUCKERBERG: Congressman, I — I don't sitting here today remember a lot of the specifics of — of early on, but we saw generally a bunch of app developers who were asking for permissions to access people's data in ways that weren't connected to the functioning of an app. So they'd just say, Okay, if you want to log in to my app, you — you would have to share all this content, even though the app doesn't actually use that in any reasonable way. So we looked at that and said, hey, this isn't — this isn't right.

Or we should review these apps and make sure that if an app developer is going to ask someone for access to their data, they actually have a reason why they want access to it. And over time, we made a series of changes that culminated in the major change in 2014 that I referenced before, where ultimately we made it so that a person could sign in but not bring their friends' information with them anymore.

WALBERG: Secondly, is there any way, any way, that Facebook can with any level of certainty assure Facebook users that every single app on its platform is not misusing their data?

ZUCKERBERG: Congressman, it would be difficult to ever guarantee that any single — that — that — that there are — that there are no bad actors. Every problem around security is — is sort of an arms race, where you have people who are trying to abuse systems, and our responsibility is to make that as hard as possible and to take the — the necessary precautions for a company of our scale. And I think that the responsibility that we have is growing with our scale and we need to make sure that we ...

WALBERG: And I think that — I think that's an adequate answer. It's a truthful answer. Can you assure me that ads and content are not being denied based on particular views?

ZUCKERBERG: Congressman, yes. Although I — I think what you — when I hear that, what I hear is kind of normal political speech. We certainly are not going to allow ads for terrorist content, for example, so ...

WALBERG: Let me — let me ...

ZUCKERBERG: ... banning those views.

(CROSSTALK)

WALBERG: And I wanted to bring up a screen grab that we had, again going back to Representative Upton earlier: it was his constituent, but he was my legislative director for a time. It was his campaign ad that he was going to boost, and the post was rejected. It said here the ad wasn't approved because it doesn't follow advertising policies: we don't allow ads that contain shocking, disrespectful or sensational content, including ads that depict violence or threats of violence. Now, as I read that, I also know that you have since — or Facebook has since declared no, that was a mistake, an algorithm problem that went on there.

But that's the concern that we have: that it wouldn't be because he had his picture with a veteran, or because he wanted to reduce spending, but because of pro-life, Second Amendment and conservative positions. Those things cause us some concern.

So I guess what I'm saying here is that I believe we ought to have a light touch in regulation. And I hear some of my friends on the other side of the aisle decry what's going on now, when they were high-fiving what took place in 2012 with President Obama and what he was capable of doing in bringing in and grabbing data for use in a political way.

I would say the best thing we can do is have these light-of-day hearings, let you self-regulate as much as possible with a light touch coming from us, but recognizing that, in the end, your Facebook users or subscribers are going to tell you what you need to do ...

WALDEN: Gentleman's time ...

WALBERG: So thank you for your time and thank you for the time you've given me.

WALDEN: Yes. Now recognize the gentlelady from California, Ms. Walters, for four minutes.

WALTERS: Thank you. Thank you, Mr. Chairman. And thank you, Mr. Zuckerberg, for being here. One of my biggest concerns is the misuse of consumer data and what controls users have over their information. You have indicated that Facebook users have granular control over their own contact — content and who can see it.

As you can see on the screen, on the left is a screenshot of the on-off choice for apps which must be on for users to use apps that require a Facebook login and which allows apps to collect your information.

On the right is a screenshot of what a user sees when they want to change the privacy settings on a post, photo or other content. Same account, same user. But which control governs? The app platform access or the user's decision as to who they want to see a particular post?

ZUCKERBERG: Sorry, could you repeat that?

WALTERS: So, which — which app governs, Okay? Or which control governs? The app platform access or the user's decision as to who they want to see a particular post? So if you look up there on the screen.

ZUCKERBERG: Congresswoman, so when you're using the service, if you share a photo, for example, and you say “I only want my friends to see it,” then in news feed and Facebook, only your friends are going to see it. If you then go to a website and then you want to sign into that website, that website can ask you and say “Hey, here are the things that — that I want to get access to in order for you to use the website.”

If you sign in after seeing that screen where the website is asking for certain information, then you are also authorizing that website to have access to that information. If you've turned off the platform completely, which is what the control is that you have on the left, then you wouldn't be able to sign in to another website. You'd have to go reactivate this before that would even work.

WALTERS: Okay, do you think that the average Facebook user understands that is how it works? And how would they find this out?

ZUCKERBERG: Congresswoman, I think that these, that the settings when you're signing into an app are quite clear in terms of, every time you go to sign into an app, you have to go through a whole screen that says “Here's the app, here's your friends who use it, here are the pieces of information that it would like to have access to.” You make a decision whether you sign in, yes or no. And until you say “I want to sign in,” nothing gets shared.

Similarly, in terms of sharing content, every single time that you go to upload a photo, you have to make a decision — it's right there at the top, it says “are you sharing this with your friends or publicly or with some group,” and every single time that's — that's quite clear. So in those cases, yes, I think that this is quite clear.

WALTERS: Okay, so these user control options are in different locations. And it seems to me that putting all privacy control options in a single location would be more user-friendly. Why aren't they in the same location?

ZUCKERBERG: Congresswoman, we typically do two things. We have a settings page that has all of your settings in one place in case you want to go and play around or configure your settings. But the more important thing is putting the settings in line when you're trying to make a decision. So if you're going to share a photo now, we think that your setting about who you want to share that photo with should be in line right there.

If you're going to sign into an app, we think that the — it should be very clear right in line when you're signing into the app what permissions that app is asking for. So we do both. It's both in one place in settings if you want to go to it, and it's in line in the relevant place.

WALTERS: Okay. California has been heralded by many on this committee for its privacy initiatives. Given that you and other major tech companies are in California and we are still experiencing privacy issues, how do you square the two?

ZUCKERBERG: Can you repeat that?

WALTERS: So, given that you and other major tech companies are in California and we're still experiencing privacy issues, how do you square the two?

ZUCKERBERG: What was the other piece?

WALTERS: California's been heralded by many in this committee for its privacy initiatives.

ZUCKERBERG: Well, Congresswoman, I think that privacy is not something that you can ever — it's — our understanding of the issues between people and how they interact online only grows over time. So I think we'll figure out what the social norms are and the rules that we want to put in place. Then five years from now, we'll come back and we'll have learned more things and either that'll just be that social norms have evolved and the company's practices have evolved or we'll put rules in place.

But I think that our understanding of this is going to evolve over quite a long time. So I would expect that even if a state like California's forward-leaning, that's not necessarily going to mean that we fully understand everything or have solved all the issues.

WALDEN: Gentle — gentlelady's time has expired. Recognize the gentlelady from Michigan, Ms. Dingell for four minutes.

DINGELL: Thank you, Mr. Chairman. Mr. Zuckerberg, thank you for your patience. I am a daily Facebook user. Much to my staff's distress, I do it myself. And because we need a little humor, I'm even married to a 91-year-old man that's thinking of Twitter.

But I know Facebook's value. I've used it for a long time. But with that value also comes obligation. We've all been sitting here for more than four hours.

Some things are striking during this conversation. As CEO, you didn't know some key facts. You didn't know about major court cases regarding your privacy policies against your company. You didn't know that the FTC doesn't have fining authority and that Facebook could not have received fines for the 2011 consent order. You didn't know what a shadow profile was. You didn't know how many apps you need to audit.

You did not know how many other firms have been sold data by Dr. Kogan other than Cambridge Analytica and Eunoia Technologies, even though you were asked that question yesterday. And yes, we were all paying attention yesterday. You don't even know all the kinds of information Facebook is collecting from its own users.

Here's what I do know. You have trackers all over the Web.

DINGELL: On practically every website you go to, we all see the Facebook Like or Facebook Share buttons. And with the Facebook pixel, people browsing the Internet may not even see that Facebook logo. It doesn't matter whether you have a Facebook account. Through those tools, Facebook is able to collect information from all of us. So I want to ask you, how many Facebook like buttons are there on non-Facebook Web pages?

ZUCKERBERG: Congresswoman, I don't know the answer to that off the top of my head, but we'll get back to you.

DINGELL: Is the number over a hundred million?

ZUCKERBERG: I believe we've served the Like button on more pages than that, but I don't know the number of pages that actively have the Like button on them.

DINGELL: How many share buttons are there on non-Facebook Web pages?

ZUCKERBERG: I don't know the answer to that exactly off the top of my head either, but that's something that we can follow up with you on.

DINGELL: And do we think that's likely over 100 million? How many chunks of Facebook pixel code are there on non-Facebook Web pages?

ZUCKERBERG: Congresswoman, you're asking some specific stats that I don't know off the top of my head, but we can follow up with you and get back to you on all of these.

DINGELL: Can you commit to getting that to the committee? The European Union is asking for 72 hours on transparency. Do you think we could get that back to the committee in 72 hours?

ZUCKERBERG: Congresswoman, I will talk to my team and we will follow up.

DINGELL: I know you're still reviewing, but do you know now whether there are other fourth parties that had access to the data from someone other than Dr. Kogan? Or is this something we're going to find out in a press release down the road? I think what worries all of us and you've heard it today is it has taken almost three years to hear about that. And I am convinced that there are other people out there.

ZUCKERBERG: Congresswoman, as I've said a number of times, we're now going to investigate every single app that had access to a large amount of people's information in the past, before we locked down the platform.

I do imagine that we will find some apps that — that were either doing something suspicious or misused people's data. If we find them, then we will ban them from the platform, take action to make sure they delete the data and make sure that everyone involved is informed.

DINGELL: And you will make it public quickly? Not three years.

ZUCKERBERG: As soon as we find them.

DINGELL: So I'm just going to conclude, because my time's almost up, by saying I worry that when I hear companies say they value our privacy, it's meant in monetary terms, not as a moral obligation to protect it. Data protection and privacy are like clean air and clean water; there need to be clear rules of the road.

WALDEN: Gentlelady's time has expired. Chair recognizes the gentleman from Pennsylvania, Mr. Costello, for four minutes.

COSTELLO: Thank you, Mr. Chairman. I would echo Congressman Collins's comments as well. Mr. Zuckerberg, I think that we as Americans have a concept of digital privacy rights and privacy that aren't necessarily codified.

And we're trying to sift through how we actually craft privacy rights in a way that is intelligible for tech and understandable to the community at large. And so my questions are oriented in that fashion. First, if you look at GDPR, the E.U. law that's about to take effect, what pieces of that do you feel would be properly placed in American jurisprudence?

In other words, the right to erasure, the right to get our data back, the right to rectify: could you share with us how you see that playing out, not just for you but for the smaller companies? Because I do believe you have a sincere interest in seeing small tech companies prosper.

ZUCKERBERG: Yes, Congressman. So there are a few parts of GDPR that I think are important and — and good. One is making sure that people have control over how each piece of information that they share is used.

So people should have the ability to know what a company knows about them, to control and have a setting about who can see it and to be able to delete it whenever they want. The second set of things is making sure that people actually understand what the tools are that are available.

So not just having it on some settings page somewhere, but putting the tools in front of people so that they can make a decision. That both builds trust and makes sure people's experiences are configured in the way that they want.

That's something that we've done a number of times over the years at Facebook. But with GDPR, we will now be doing more, and around the whole world. The third piece is that there are some very sensitive technologies that I think are important to enable innovation around, like face recognition, but that you want to make sure you get special consent for.

Right? If we — if we make it too hard for American companies to innovate in areas like facial recognition, then we will lose to Chinese companies and other companies around the world that are able to innovate in that.

COSTELLO: Do you feel you should be able to deploy AI for facial recognition for a non-FB user?

ZUCKERBERG: Congressman, I think that that's a — that's a good question. And I think that this is something that probably — that — that we should — that people should have control over, how it is used and that we're going to be rolling out and asking people whether they want us to use it for them around the world as part of this — this push that's upcoming. But I think in general for — for sensitive technologies like that, I do think you want a special consent.

COSTELLO: Right.

ZUCKERBERG: And I think that's a — that would be a valuable thing to consider.

COSTELLO: Two — two quick ones. Does — is Facebook, in utilizing that platform, ever a publisher in your mind?

ZUCKERBERG: Congressman ...

COSTELLO: You would say you're responsible for content, right, you said that yesterday. Are you ever a publisher, as the term is legally used?

ZUCKERBERG: Congressman, I'm not familiar with how the term is legally used.

COSTELLO: Would you ever be legally responsible for the content that is put onto your platform?

ZUCKERBERG: Well, Congressman, let me put it this way, there is content that we fund, specifically in video today.

COSTELLO: Right.

ZUCKERBERG: And when we're commissioning a video to be created, then I certainly think we have full responsibility ...

COSTELLO: Agreed.

ZUCKERBERG: ... of owning — of owning that content.

COSTELLO: Which is what I think Chairman Walden's question was upfront. Right.

ZUCKERBERG: But the vast majority of the content on Facebook is not something that we commissioned. For that, I think our responsibility is to make sure that the content on Facebook is not harmful, that people are seeing things that are relevant to them and that encourage interaction and building relationships with the people around them. And that, I think, is — is the primary responsibility that we have.

COSTELLO: My big concern, and I'm going to run out of time, is that someone limits their data from being used for something that it might potentially be used for, with no idea how it might actually benefit society.

And I'm out of time, but I would like for you to share at a later point in time how the data that you get might be limited by users, and how your inability to use that data may actually prevent the kind of innovation that would bring about positive social change in this country.

Because I do believe that was the intention and objective to — of your company. And I do believe you perform it very, very, very well in a lot of ways. Thank you. I yield back.

ZUCKERBERG: Thank you.

WALDEN: Gentleman yields back. We go now to the gentleman from Georgia, Mr. Carter, for four minutes.

CARTER: Thank you, Mr. Chairman. Thank you, Mr. Zuckerberg for being here. You're almost done. When you get to me, that means you're getting close to the end. So congratulations. Thank you for being here. We do appreciate it.

You know, you wouldn't be here if it wasn't for the — the privacy of people's information and the fact that we had — you had this lapse. You know all about fake news, you know all about foreign intervention. I know you're concerned about that. I want to talk about just a — a few different subjects, if you will.

And I'd like to ask you just some yes-or-no questions; please excuse my redundancy. I know that some members have already asked you about some of these subjects, but I would like to ask you. Mr. Zuckerberg, did you know that 91 people die every day because of opioid addiction? Yes or no, did you know that? Ninety-one people every day.

ZUCKERBERG: I did not know that specifically.

CARTER: Did you know that it's estimated there are between two and a half and 11 and a half million people in this country right now who are addicted to opioids?

ZUCKERBERG: Yes.

CARTER: Okay, did you know that the average life expectancy of Americans has decreased for the first time in decades as a result of — what people are saying is a result of the opioid epidemic?

ZUCKERBERG: Yes, especially among certain demographics.

CARTER: Absolutely. I ask you this because some of the other members have mentioned the ads for fentanyl and other illicit drugs that are on the Internet, and where you can buy them, and your responsibility to — to monitor that and make sure that's not happening.

I had the opportunity this past week to speak at the Prescription Drug Abuse and Heroin Summit in Atlanta that Representative Hal Rogers started some years ago. We also had the FDA commissioner there, and he mentioned that he's going to be meeting with CEOs of Internet companies to discuss this problem.

I hope that you will be willing to at least have someone there to meet with him so that we can get your help with this; this is extremely important.

ZUCKERBERG: Congressman, I will make sure that someone is there. (Inaudible).

CARTER: Okay, let me ask you another question. Mr. Zuckerberg, did you know that there are conservation groups that have provided evidence to the Securities and Exchange Commission that endangered wildlife goods, in particular ivory, are extensively traded in closed groups on Facebook?

ZUCKERBERG: Congressman, I was not specifically aware of that, but I think we — we know that — that there are issues with content like this, that we need more proactive monitoring for.

CARTER: Okay, let me — all right, well let me ask you, did you know that there are some conservation groups that assert that there's so much ivory being sold on Facebook that it's literally contributing to the extent — to the extinction of the elephant species?

ZUCKERBERG: Congressman, I have not heard that.

CARTER: Okay, and — and did you know that the American — or excuse me, the Motion Picture Association of America is having problems with piracy of movies and of their products, and that not only is this challenging their profits, but their very existence. Did you know that that was a problem?

ZUCKERBERG: Congressman, I believe that has been an issue for a long time.

CARTER: It has been. It has been, so you did know that. Well, the reason I ask you this is that I just want to make sure I understand that you have an understanding and a commitment. Look, you said earlier, it may have been yesterday, that hate speech is difficult to discern.

And I get that, I understand that, and you're absolutely right. But these things are not, and we need your help with this. Now, I will tell you there are members of this body who would like to see the Internet regulated as a utility.

I am not one of those, I believe that that would be the worst thing we could do. I believe it would stifle innovation, I don't think you can legislate morality and I don't want to try and do that. But we need a commitment from you that these things that can be controlled like this, that you will help us.

And that you'll work with law enforcement to — to help us with this. Look, you love America, I know that, we all know that. We need your help here. We don't — I don't want Congress to have to act. You — you want to see a mess, you let the federal government get into this.

You'll see a mess, I assure you.

WALDEN: Gentleman's ...

CARTER: Please, we — we need your help with this. And I just need that commitment, can I get that commitment?

ZUCKERBERG: Congressman, yes, we take this very seriously. That's a big part of the reason why, on all of these content issues, by the end of this year we're going to have more than 20,000 people working on security and content review.

And we need to build more tools, too.

CARTER: Thank you very much.

WALDEN: Gentleman's time has expired. Chair recognizes, Mr. Duncan, for four minutes.

DUNCAN: Thank you Mr. Chairman. Usually I'm last, but today I think we have one behind me that came in late. Mr. Zuckerberg, I want to ...

WALDEN: Only by two minutes did he come in late.

(LAUGHTER)

DUNCAN: I want to thank you for all the work you've done. And I want to let you know that I've been on Facebook since 2007. Started as a state legislator, used Facebook to communicate with my constituents. And it has been an invaluable tool for me in communicating.

We can actually deal with multiple issues in real time as we handle them here in Congress, and answer questions. It's almost like a town hall in real time. I also want to tell you that your staff here at the Governmental Affairs Office, Chris Herndon and others, do a fabulous job of keeping us informed.

So I want to thank you for that. Before this hearing, when we heard about it, we asked our constituents and our friends on Facebook what they would want me to ask you. And the main response was addressing the perceived, and in many instances confirmed, bias and viewpoint discrimination against Christians and conservatives on your platform.

Today, listening to this, I think the two main issues are user privacy and censorship. The Constitution of the United States, in the First Amendment, says: "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the government for a redress of grievances." I've got a copy of the Constitution I want to give you at the end of this hearing.

The reason I say all that, and this is maybe a rhetorical question, is: why not have a community standard for free speech and free exercise of religion that is simply a mirror of the First Amendment, with algorithms that are viewpoint-neutral? Why not do that?

ZUCKERBERG: Well Congressman, I think that we can all agree that certain content like terrorist propaganda should have no place on our network. And the First Amendment, my understanding of it, is that that kind of speech is allowed in the world.

I just don't think that it is the kind of thing that we want to allow to spread on the Internet. So once you get into that, you're already — you're deciding that you — you take this value that you care about safety. And that we don't want people to be able to spread information that can cause harm.

And I think that that — it — our general responsibility is to — is to allow the broadest spectrum of free expression as we can ...

DUNCAN: And I appreciate — I appreciate that answer. You're right about propaganda and other issues there. And I believe the Constitution generally applies to government and says that Congress shall make no law respecting — talks about religion.

And then we don't want to abridge the freedom of speech or of the press. But the standard has been applied to private businesses, whether those are newspapers or other media platforms. And I would argue that social media has now become a media platform to be considered, in a lot of ways, the same as other press media.

So I think the First Amendment probably does apply and will apply. What will you do — and let me ask you this, what will you do to restore the First Amendment rights of Facebook users and ensure that all users are treated equally, regardless of whether they're conservative, moderate, liberal or whatnot?

ZUCKERBERG: Well Congressman, I think that we — we make a number of mistakes in content review today that I don't think only focus on one political persuasion. And I think it's unfortunate that when those happen, people think that we're focused on them.

And it happens in different political groups, and it's — we have ...

DUNCAN: In the interest of time, conservatives are the ones that raise the awareness that their content has been pulled. I don't see the same awareness being raised by liberal organizations, liberal candidates or liberal policy statements.

So I think — and I think you've been made aware of this over the last two days, probably need to go back and make sure that those things are treated equal. And I would appreciate if you do that. Again, I appreciate the platform, I appreciate the work that you do.

And we stand willing and able to help you here in Congress, because Facebook is an invaluable part of what we do and how we communicate, so thanks for being here.

ZUCKERBERG: Thank you.

DUNCAN: I yield back.

WALDEN: And for our final four minutes of questioning comes from Mr. Cramer, North Dakota, former head of the Public Utility Commission there. We welcome your comments. Go ahead.

CRAMER: Thank you, and thanks for being here, Mr. Zuckerberg.

And you know, “Don't eat the fruit of this tree” is the only regulation that was ever initiated before people started abusing freedom. Since then, millions of regulations, laws and rules have been created in response to an abuse of freedom. Oftentimes, that response is a — is more extreme than the abuse, and that's what I fear could happen, based on some of the things I've heard today in response to this.

So this national discussion is very important. First of all, it's not — not only for these two days, but that it continues, lest we over-respond, okay?

Now, that said, I think that the consumer and industry, and whatever industry it is, your company or others — others like yours, share that responsibility. So I appreciate both your patience and your preparation coming in today.

But in response to the questions from a few of my colleagues related to the — the illegal drug ads, I have to admit that there were times when I was thinking, “His answers aren't very reassuring to me.”

And I'm wondering what your answer would be as to how quickly you could take down an illegal drug site, if there was a million-dollar per post, per day regulation fine tied to it. In other words, give it your best. I mean, don't wait for somebody to flag it. Look for it. Make it a priority. It's certainly far more dangerous than a couple of conservative Christian women on — on TV. So please, be better than this.

ZUCKERBERG: Congressman, I agree that this is very important, and I — I miscommunicated if I left the impression that we weren't proactively going to work on tools to take down this content, and we're only going to rely on people to flag it for us.

Right now, I think underway, we have efforts to focus not only on ads, which has been most of the — the majority of the questions, but a lot of people share this stuff in groups, too, and the — the free part of the products that aren't paid, and we need to get that content down, too.

I understand how big of an issue this is. Unfortunately, the enforcement isn't — isn't perfect. We do need to make it more proactive, and I'm committed to doing that.

CRAMER: And I don't expect it to be perfect, but I do expect it to be a higher priority than conservative thought. Speaking of that, I think in — in some of your responses to Senator Cruz yesterday, and some responses today, related to liberal bias, you've — you've sort of implied the fact that while you have these 20,000 enforcement folks, you've implied that the Silicon Valley — perhaps this was more yesterday — that Silicon Valley is a very liberal place, and so the talent pool perhaps leans left, and it's biased.

Let me suggest that you look someplace perhaps in the middle of the North American continent for some people, maybe even your next big investment of — of capital could be in — in some place like, say, Bismarck, North Dakota, or Williston, where you have visited, where people tend to be pretty common sense, and probably, perhaps, even more diverse than Facebook in — in some respects. If the talent pool is a problem, then let's look for a different talent pool, and maybe we can even have a nice, big center someplace.

I want to then close with this, because you testified yesterday, and the opening statement by the ranking member of the committee bothered me, in that suddenly there is this great concern that the providers, particularly Facebook, other large ads providers, and — and content providers should be hyper-regulated, when all along, we — we, as Republicans, have been talking about net neutrality.

We — we talked about earlier this year, when we — or last year, when we rolled back the Internet service provider privacy stuff that seemed tilted heavily in your favor, and against them. Don't you think that ubiquitous platforms like Google, and Facebook, and — and many others have — should have the same responsibility to privacy as an Internet service provider?

ZUCKERBERG: Congressman, let me answer that in a second, and before — before I get to that, on your last point, the content reviewers who we have are not primarily located in — in — in Silicon Valley. So I think that — that's — that was an important point, and ...

CRAMER: It is.

ZUCKERBERG: ... I do worry about the general bias of people in Silicon Valley. But the — the majority of the folks doing content review are — are around the world in different places.

To your question about net neutrality, I think that there's a big difference between Internet service providers and platforms on top of them. And the big reason is that, well, I just think about my own experience.

When I was starting Facebook, I had one choice of an Internet service provider. And if I had to potentially pay extra in order to make it so that people could have Facebook as an option for something that they used, then I'm not sure that we'd be here today. Platforms, there are just many more.

So it may be true that a lot of people choose to use Facebook. The average American, I think, uses about eight different communication and social network apps to stay connected to people. And it's just as clearly true that there are more choices among platforms. So even though they can reach large scale, I think the pressure of just having one or two in a place does require us to think a little bit ...

CRAMER: I would submit to you that I have fewer choices in — on the platform, in — in your type of a platform, than they do Internet service providers, even in rural North Dakota.

With that, thank you, Mr. Chairman.

WALDEN: I suppose you don't want to hang around for another round of questions? Just kidding. Mr. Zuckerberg ...

CRAMER: Isn't he funny?

WALDEN: Staff, several of them, just passed out behind you.

You know, on a serious note as we close, I would welcome your suggestions of other technology CEOs we might benefit from hearing from in the future for a hearing on these issues, as we look at net neutrality, as we looked at privacy issues. These are all important. They are very controversial. We're fully cognizant of that. We want to get it right, and — and so we appreciate your comments and — and testimony today.

There are no other members that haven't asked you questions, and we're not doing a second round, so seeing that, I just want to thank you for being here. I know we agreed to be respectful of your time. You have been respectful of our questions, and we appreciate your answers and your candor.

As you know, some of our members weren't able to ask all the questions they had, so they'll probably submit those in — in writing, and we would — we would like to get answers to those back in a timely manner.

I'd also like to include the following documents be submitted for the record by unanimous consent: a letter from the American Civil Liberties Union, a letter from NetChoice, a letter from the Vietnam Veterans of America, which I referenced in my opening remarks.

A letter from Public Knowledge, a letter and an FTC complaint from Electronic Privacy Information Center, a letter from the Motion Picture Association of America, a letter from ACT, the App Association, a letter from the Committee for Justice, a letter from the Transatlantic Consumer Dialogue, and a letter from the Civil Society Groups, and a letter from the National Council of Negro Women.

Pursuant to committee rules, I remind members they have 10 business days to submit additional questions for the record, and I ask that the witness submit their responses within 10 business days upon receipt of those questions. Without objection, our — our committee is now adjourned.

List of Panel Members and Witnesses

PANEL MEMBERS:

REP. GREG WALDEN, R-ORE., CHAIRMAN

REP. FRED UPTON, R-MICH.

REP. JOE L. BARTON, R-TEX.

REP. JOHN SHIMKUS, R-ILL.

REP. MICHAEL C. BURGESS, R-TEX.

REP. MARSHA BLACKBURN, R-TENN.

REP. STEVE SCALISE, R-LA.

REP. ROBERT E. LATTA, R-OHIO

REP. CATHY MCMORRIS RODGERS, R-WASH.

REP. BRETT GUTHRIE, R-KY.

REP. LEONARD LANCE, R-N.J.

REP. GREGG HARPER, R-MISS.

REP. PETE OLSON, R-TEX.

REP. H. MORGAN GRIFFITH, R-VA.

REP. ADAM KINZINGER, R-ILL.

REP. DAVID B. MCKINLEY, R-W.VA.

REP. GUS BILIRAKIS, R-FLA.

REP. BILL JOHNSON, R-OHIO

REP. BILLY LONG, R-MO.

REP. LARRY BUCSHON, R-IND.

REP. BILL FLORES, R-TEX.

REP. SUSAN W. BROOKS, R-IND.

REP. MARKWAYNE MULLIN, R-OKLA.

REP. RICHARD HUDSON, R-N.C.

REP. CHRIS COLLINS, R-N.Y.

REP. KEVIN CRAMER, R-N.D.

REP. TIM WALBERG, R-MICH.

REP. MIMI WALTERS, R-CALIF.

REP. RYAN A. COSTELLO, R-PA.

REP. E.L. “BUDDY” CARTER, R-GA.

REP. JEFF DUNCAN, R-S.C.

REP. FRANK PALLONE JR., D-N.J., RANKING MEMBER

REP. BOBBY L. RUSH, D-ILL.

REP. ANNA G. ESHOO, D-CALIF.

REP. ELIOT L. ENGEL, D-N.Y.

REP. GENE GREEN, D-TEX.

REP. DIANA DEGETTE, D-COLO.

REP. MIKE DOYLE, D-PA.

REP. JAN SCHAKOWSKY, D-ILL.

REP. G.K. BUTTERFIELD, D-N.C.

REP. DORIS MATSUI, D-CALIF.

REP. KATHY CASTOR, D-FLA.

REP. JOHN SARBANES, D-MD.

REP. BEN RAY LUJÁN, D-N.M.

REP. PAUL TONKO, D-N.Y.

REP. JERRY MCNERNEY, D-CALIF.

REP. PETER WELCH, D-VT.

REP. YVETTE D. CLARKE, D-N.Y.

REP. TONY CÁRDENAS, D-CALIF.

REP. JOSEPH P. KENNEDY III, D-MASS.

REP. DAVID LOEBSACK, D-IOWA

REP. KURT SCHRADER, D-ORE.

REP. DEBBIE DINGELL, D-MICH.

REP. SCOTT PETERS, D-CALIF.

REP. RAUL RUIZ, D-CALIF.

WITNESSES:

FACEBOOK CEO MARK ZUCKERBERG TESTIFIES

White House declares Islamic State 100 percent defeated in Syria - The Washington Post

White House declares end to Islamic State, but fighting grinds on

U.S.-backed forces have pushed the Islamic State out of its final foothold in Syria, the White House said Friday, making a long-awaited victory announcement but defying eyewitness accounts of continued fighting.

Speaking to reporters aboard Air Force One, White House press secretary Sarah Sanders said the group’s “territorial caliphate has been eliminated in Syria.”

Trump, making brief remarks to reporters after landing in Palm Beach, Fla., showed reporters a map comparing Iraq and Syria at the height of Islamic State power in 2014 with today.

“That’s what we have right now,” he said, indicating areas no longer controlled by the militants.

The announcement, more than four years after the United States launched its first airstrikes against the then-formidable militant group, follows months of speculation about when U.S.-backed Syrian forces would capture the Islamic State’s final foothold in eastern Syria.

Neighboring Iraq declared victory over the group in late 2017.

But the White House statements were immediately contradicted by reports from eyewitnesses and local forces in eastern Syria, where the U.S.-backed ­Syrian Democratic Forces (SDF) have struggled to root out militant holdouts who are dug in among civilians.

Mustafa Bali, a spokesman for the SDF, said the fighting had not eased up around the village of Baghouz, which has been the scene of an intense battle against those holdouts.

“Heavy fighting continues around mount #Baghouz right now to finish off whatever remains of ISIS,” he said in a message on Twitter.

A U.S. military official, speaking on the condition of anonymity because he was not authorized to comment publicly, said the SDF was still working “to clear pockets of ISIS from caves under Baghouz.”

The official said there appeared to be a few hundred militants remaining around Baghouz.

Photographs from the area showed the night sky lit up with tracer rounds.

The militants appeared to be pinned down along a cliff near the Euphrates River as they mount a desperate final stand.

More than 50,000 people have left the enclave since January, surprising military planners who had repeatedly believed the area to be almost empty.

On Thursday, the International Rescue Committee said that thousands more civilians could follow in the coming days.

“These women and children are in the worst condition we have seen since the crisis first began,” said Wendy Taeuber, the group’s Iraq and northeast Syria country director.

The Pentagon did not immediately provide an explanation for the apparent disconnect between the White House depiction and reports from eastern Syria.

Trump, who has been eager to end the U.S. military mission in Syria, has repeatedly suggested in recent months that a final victory was imminent, only to have the fighting drag on.

In December, Trump made another victory declaration as he announced, in a surprise move, that he would pull out all 2,000 U.S. troops from Syria.

In the following weeks, the president appeared to back away from that victory claim as top advisers warned that an abrupt departure from Syria would alienate allies and jeopardize gains against the militants.

The Pentagon now plans to keep at least 400 troops in Syria to help the SDF and other allies maintain security in former Islamic State strongholds.

While a conclusion to the operation would be a milestone for the Pentagon, officials expect the group will seek to mount continued insurgent attacks in Syria, as it has in Iraq.

Sanders said Trump had been briefed during his flight by acting defense secretary Patrick Shanahan.

Shanahan joins Trump at his exclusive Mar-a-Lago resort as the president considers nominating the former Boeing executive to the top Pentagon job.

It was not immediately clear whether Shanahan conveyed to Trump that the Islamic State had been ejected from Baghouz, or whether Trump or Shanahan were aware of the assessment from Syrian and U.S. forces in the region.

Loveluck reported from London. John Wagner in Washington contributed to this report.

Russia’s attempt to hack voting systems shows that our elections need better security - The Washington Post
PostEverything

Russia’s attempt to hack voting systems shows that our elections need better security

Even failed attacks can sow doubt about our democracy.

Bruce Schneier is a security technologist and a lecturer at the Kennedy School of Government at Harvard University. His new book, "Click Here to Kill Everybody," will be published in September.

This week brought new public evidence about Russian interference in the 2016 election. On Monday, the Intercept published a top-secret National Security Agency document describing Russian hacking attempts against the U.S. election system. While the attacks seem more exploratory than operational — and there’s no evidence that they had any actual effect — they further illustrate the real threats and vulnerabilities facing our elections, and they point to solutions.

The document describes how the GRU, Russia’s military intelligence agency, attacked a company called VR Systems that, according to its website, provides software to manage voter rolls in eight states. The August 2016 attack was successful, and the attackers used the information they stole from the company’s network to launch targeted attacks against 122 local election officials on Oct. 27, 12 days before the election.

That is where the NSA’s analysis ends. We don’t know whether those 122 targeted attacks were successful, or what their effects were if so. We don’t know whether other election software companies besides VR Systems were targeted, or what the GRU’s overall plan was — if it had one. Certainly, there are ways to disrupt voting by interfering with the voter registration process or voter rolls. But there was no indication on Election Day that people found their names removed from the system, or their address changed, or anything else that would have had an effect — anywhere in the country, let alone in the eight states where VR Systems is deployed. (There were Election Day problems with the voting rolls in Durham, N.C. — one of the states that VR Systems supports — but they seem like conventional errors and not malicious action.)

And 12 days before the election (with early voting already well underway in many jurisdictions) seems far too late to start an operation like that. That is why these attacks feel exploratory to me, rather than part of an operational attack. The Russians were seeing how far they could get, and keeping those accesses in their pocket for potential future use.

Presumably, this document was intended for the Justice Department, including the FBI, which would be the proper agency to continue looking into these hacks. We don’t know what happened next, if anything. VR Systems isn’t commenting, and the names of the local election officials targeted did not appear in the NSA document.

So while this document isn’t much of a smoking gun, it’s yet more evidence of widespread Russian attempts to interfere last year.

The document was, allegedly, sent to the Intercept anonymously. An NSA contractor, Reality Leigh Winner, was arrested Saturday and charged with mishandling classified information. The speed with which the government identified her serves as a caution to anyone wanting to leak official U.S. secrets.

The Intercept sent a scan of the document to another source during its reporting. That scan showed a crease in the original document, which implied that someone had printed the document and then carried it out of some secure location. The second source, according to the FBI’s affidavit against Winner, passed it on to the NSA. From there, NSA investigators were able to look at their records and determine that only six people had printed out the document. (The government may also have been able to track the printout through secret dots that identified the printer.) Winner was the only one of those six who had been in email contact with the Intercept. It is unclear whether the email evidence was from Winner’s NSA account or her personal account, but in either case, it’s incredibly sloppy tradecraft.

With President Trump’s election, the issue of Russian interference in last year’s campaign has become highly politicized. Reports like the one from the Office of the Director of National Intelligence in January have been criticized by partisan supporters of the White House. It’s interesting that this document was reported by the Intercept, which has been historically skeptical about claims of Russian interference. (I was quoted in their story, and they showed me a copy of the NSA document before it was published.) The leaker was even praised by WikiLeaks founder Julian Assange, who up until now has been traditionally critical of allegations of Russian election interference.

This demonstrates the power of source documents. It’s easy to discount a Justice Department official or a summary report. A detailed NSA document is much more convincing. Right now, there’s a federal suit to force the ODNI to release the entire January report, not just the unclassified summary. These efforts are vital.

This hack will certainly come up at the Senate hearing where former FBI director James B. Comey is scheduled to testify Thursday. Last year, there were several stories about voter databases being targeted by Russia. Last August, the FBI confirmed that the Russians successfully hacked voter databases in Illinois and Arizona. And a month later, an unnamed Department of Homeland Security official said that the Russians targeted voter databases in 20 states. Again, we don’t know of anything that came of these hacks, but expect Comey to be asked about them. Unfortunately, any details he does know are almost certainly classified, and won’t be revealed in open testimony.

But more important than any of this, we need to better secure our election systems going forward. We have significant vulnerabilities in our voting machines, our voter rolls and registration process, and the vote tabulation systems after the polls close. In January, DHS designated our voting systems as critical national infrastructure, but so far that has been entirely for show. In the United States, we don’t have a single integrated election. We have 50-plus individual elections, each with its own rules and its own regulatory authorities. Federal standards that  mandate voter-verified paper ballots and post-election auditing would go a long way to secure our voting system. These attacks demonstrate that we need to secure the voter rolls, as well.

Democratic elections serve two purposes. The first is to elect the winner. But the second is to convince the loser. After the votes are all counted, everyone needs to trust that the election was fair and the results accurate. Attacks against our election system, even if they are ultimately ineffective, undermine that trust and — by extension — our democracy. Yes, fixing this will be expensive. Yes, it will require federal action in what’s historically been state-run systems. But as a country, we have no other option.

CORRECTION: An earlier version of this story misstated the number of days before the election the Russian hack attempt began. It was 12 days before the election, not five. Also, the story incorrectly described the time between when the NSA document was produced and when former FBI director James B. Comey left the bureau. It was dated shortly before he left, not well after.

Facebook needs to do more than come clean and apologize - The Washington Post
The Post's View

Facebook needs to do more than come clean and apologize


Cardboard cutouts of Facebook founder and chief executive Mark Zuckerberg were placed outside the U.S. Capitol in Washington on April 10, the day Zuckerberg testified before Congress. (Saul Loeb/AFP/Getty Images)

EVERY EXPLOSIVE report on Facebook’s data-dealing in recent months is really part of the same story. Facebook wanted to connect the world, and it also wanted to make money. To do both, it decided to connect itself to the rest of the Web — by sharing user information from firm to firm. At some point, Facebook discovered that users did not want their data shared so widely and that regulators also objected. The company changed its policies. But its response was halfhearted and uneven. To recover trust now, it must do more than come clean and apologize.

The latest investigation from the New York Times chronicling Facebook’s privacy faux pas describes partnerships Facebook formed with other technology corporations. The agreements range from deals with device-makers integrating Facebook with their systems, to an “instant personalization” experiment that let sites tailor their displays to users’ public profiles, to arrangements that allowed companies such as Netflix and Spotify to read some private messages.

The deals helped Facebook extend its reach across the Internet, and data the company gained in return could have helped it improve its then-fledgling targeted-advertising system — which has since become a golden goose. The partners benefited, too, from shiny features and from information that helped them better understand their own audiences.

Facebook says services that received private data without explicit consent could use that information only to “recreate the Facebook experience” — making them simple extensions of its own social network. This argument is plausible for some partnerships, such as with device-makers. For others, it is less persuasive. And that is only one type of agreement described in the Times article; Netflix and Spotify, for example, had broader permissions to handle data but also provided more notice. The bottom line, however, is clear: Consumers often did not know what information Facebook was giving away, to whom or for how long.

Facebook evidently thought in its earlier days that it could share whatever data it wanted without anyone protesting. The company eventually learned different, including from a Federal Trade Commission investigation that resulted in a 2011 consent decree. And over time, it rolled back many of these privacy-violating features — but messily. Most troubling, Facebook continued to give some companies more access to data than its public pronouncements suggested.

D.C.’s attorney general announced Wednesday that the District would sue Facebook, marking the first U.S. regulatory action in response to its Cambridge Analytica scandal. It probably won’t be the last. But Facebook’s future will depend on more than the outcome of any court case. The company faces a trust deficit that grows with every story of apparent negligence.

Facebook has to come clean about exactly what it has shared in the past and what it is sharing now. But that’s no longer sufficient. The best way to regain trust now would be to endorse a federal privacy law — a real one. That requires more than pushing for mushy principles that every tech company seems to say it supports, and more than advocating the loosest possible framework to preempt California’s stricter regulations. It requires accepting and supporting a future in which users really control their own data.


U.S. military to test missiles banned under faltering nuclear pact with Russia - The Washington Post

U.S. military to test missiles banned under faltering nuclear pact with Russia


Secretary of State Mike Pompeo announced the U.S. withdrawal from the INF Treaty with Russia on Feb. 1, 2019. (Eric Baradat/AFP/Getty Images)

The Pentagon is gearing up to test missiles banned by a Cold War-era arms control pact with Russia that is set to end formally this summer after President Trump’s withdrawal over Russian violations.

The U.S. military plans to test a ground-launched cruise missile with a range of about 600 miles in August and a midrange ballistic missile with a range of about 1,800 to 2,500 miles in November, according to senior U.S. defense officials who spoke on the condition of anonymity to discuss sensitive military matters. 

The testing, production and deployment of missiles with those ranges are prohibited by the Intermediate Range Nuclear Forces Treaty, or the INF Treaty. But Trump withdrew from the treaty on Feb. 1 and triggered a formal six-month waiting period before the final expiry of the agreement this summer.

Washington and Moscow will then be free to test, produce and deploy the intermediate-range missiles that both countries have agreed to ban for more than three decades. Research and development of the banned missiles isn’t prohibited by the treaty.

Russia suspended its participation in the treaty after Trump’s withdrawal. Russian President Vladi­mir Putin vowed to design new weapons banned under the pact but said he would deploy them only if the United States does.

Washington has said Moscow is already deploying a missile that violates the agreement and cited that weapon as a reason for its withdrawal from the pact. The Kremlin has denied that accusation.

The race to develop new intermediate-range missiles banned by the treaty raises concerns about a new nuclear arms race with Russia as an arms-control framework constructed during the Cold War shows increasing signs of eroding. The senior U.S. defense officials cautioned that the United States was looking at only conventional variants of the new missiles slated for testing later this year. Theoretically, in the future they could be armed with nuclear warheads.

Signed in 1987 by President Ronald Reagan and Soviet President Mikhail Gorbachev, the INF Treaty was widely viewed as a breakthrough in arms control. The pact banned all ground-launched missiles, both nuclear and nonnuclear, with ranges from 310 to 3,400 miles. It ended a particularly tense period in the Cold War arms race, in which Washington and Moscow dotted Europe with nuclear-tipped rockets.

U.S. officials say the Trump administration has no plans to seek the forward deployment of nuclear missiles in Europe once again, but the breakdown of the treaty threatens a return to an era in which Europeans worried about Russian nuclear missiles that could strike their cities within a few minutes of launching. The systems the Pentagon is planning to test are similar to the missiles that the United States deployed in the 1980s, although without nuclear warheads attached. The deployment of those missiles fueled tension with the Soviet Union that ultimately led to the conclusion of the INF Treaty.

The U.S. ground-launched cruise missile is slated for testing in August, just after the treaty formally ends. According to a senior defense official, it will essentially involve putting a Tomahawk missile in a container that could be placed on a ship or in a mobile launcher.

“We’ll actually launch it, and it’ll fly out, and we’ll prove the concept — that you can take a Tomahawk and put it on a truck,” the senior defense official said. Deployment of the mobile missile would require procuring the system and training and equipping the forces that operate it. The official said that could take place within 18 months. 

Washington has not spoken to any European or Asian allies about the possibility of hosting the missile on their territory, according to the defense officials. The U.S. military could keep it in its arsenal at home for possible deployment if a situation warranted.

“We haven’t engaged any of our allies about formal deployment,” the senior official said. “But it’s always going to be deployable.” Asked about a possible forward deployment, the official added, “We are far away from that consideration.”

The United States previously deployed a mobile ground-launched cruise missile known as the BGM-109G Gryphon in Europe during the Cold War, but the Pentagon withdrew the weapon as a result of the INF Treaty’s restrictions.

The intermediate-range ballistic missile that the Pentagon is planning to test in November is a much longer-term effort. The test comes as the Army also explores developing longer-range missiles. If the proof of concept works in November, then the Army would develop, procure and roll out the system, according to the senior defense official, who predicted that process would take no less than five years. 

The official said the missile was different from the Army Tactical Missile System, and would more closely resemble the Pershing II ballistic missiles that the United States deployed at the end of the Cold War in the years before the signing of the INF Treaty. 

 “It’s a brand-new missile,” the senior defense official said. “Think Pershing II. It’s a missile of that class.”

Both the Obama administration and the Trump administration urged Russia to come back into compliance with the INF Treaty and end the production and deployment of its banned intermediate-range missile. Russia denied the allegations, and instead accused the United States of violating the pact through its missile defense installations in Europe — accusations the State Department rejected.

The senior defense official said the Pentagon would stand down on the tests if Russia were to come back into compliance and the treaty survived. “If the Russians come back in, in August we wouldn’t do the test,” the official said.

#####EOF##### How the Supreme Court could keep police from using your cellphone to spy on you - The Washington Post
PostEverything

How the Supreme Court could keep police from using your cellphone to spy on you


The Supreme Court will consider whether police must have a warrant to track your movements by using cellphone records. (AP Photo)
Bruce Schneier is a security technologist and a lecturer at the Kennedy School of Government at Harvard University. His new book, "Click Here to Kill Everybody," will be published in September.

The cellphones we carry with us constantly are the most perfect surveillance device ever invented, and our laws haven’t caught up to that reality. That might change soon.

This week, the Supreme Court will hear a case with profound implications for your security and privacy in the coming years. The Fourth Amendment’s prohibition of unreasonable searches and seizures is a vital right that protects us all from police overreach, and the way the courts interpret it is increasingly nonsensical in our computerized and networked world. The Supreme Court can either update current law to reflect that world, or it can further solidify an unnecessary and dangerous police power.

The case centers on cellphone location data and whether the police need a warrant to get it or can instead use a simple subpoena, which is easier to obtain. Current Fourth Amendment doctrine holds that you lose all privacy protections over any data you willingly share with a third party. Your cellular provider, under this interpretation, is a third party with whom you’ve willingly shared your movements, 24 hours a day, going back months — even though you don’t really have any choice about whether to share with it. So police can request from cell carriers records of where you’ve been, without any judicial oversight. The case before the court, Carpenter v. United States, could change that.

Traditionally, information that was most precious to us was physically close to us. It was on our bodies, in our homes and offices, in our cars. Because of that, the courts gave that information extra protections. Information that we stored far away from us, or gave to other people, afforded fewer protections. Police searches have been governed by the “third-party doctrine,” which explicitly says that information we share with others is not considered private.

The Internet has turned that thinking upside-down. Our cellphones know who we talk to and, if we’re talking via text or email, what we say. They track our location constantly, so they know where we live and work. Because they’re the first and last thing we check every day, they know when we go to sleep and when we wake up. Because everyone has one, they know whom we sleep with. And because of how those phones work, all that information is naturally shared with third parties.

More generally, all our data is stored on computers belonging to other people. It’s our email, text messages, photos, Google docs and more — all in the cloud. We store it there not because it’s unimportant, but precisely because it is important. And as the Internet of Things computerizes the rest of our lives, even more data will be collected by other people: data from our health trackers and medical devices, data from our home sensors and appliances, and data from Internet-connected “listeners” such as Alexa, Siri and your voice-activated television.

All this data will be collected and saved by third parties, sometimes for years. The result is a detailed dossier of your activities more complete than any private investigator — or police officer — could possibly collect by following you around.

The issue here is not whether the police should be allowed to use that data to help solve crimes. Of course they should. The issue is whether that information should be protected by the warrant process that requires the police to have probable cause to investigate you and get approval by a court.

Warrants are a security mechanism. They prevent the police from abusing their authority to investigate someone they have no reason to suspect of a crime. They prevent the police from going on “fishing expeditions.” They protect our rights and liberties, even as we willingly give up our privacy to the legitimate needs of law enforcement.

The third-party doctrine never made much sense. Just because I share an intimate secret with my spouse, friend or doctor doesn’t mean that I no longer consider it private. It makes even less sense in today’s hyper-connected world. It’s long past time the Supreme Court recognized that a months-long history of my movements is private, and that my emails and other personal data deserve the same protections whether they’re on my laptop or on Google’s servers.


#####EOF##### Nuclear regulators were unaware of transfer of sensitive technical information to Saudi Arabia - The Washington Post

Nuclear regulators were unaware of transfer of sensitive technical information to Saudi Arabia


A Saudi flag is seen at a conference in Baku, Azerbaijan, on March 18, 2019. (Mladen Antonov/AFP/Getty Images)

When the Trump administration on seven occasions authorized companies to share sensitive nuclear energy information with Saudi Arabia, it was supposed to consult with several agencies, including the independent Nuclear Regulatory Commission.

But NRC Chairman Kristine L. Svinicki testified before the Senate Environment and Public Works Committee on Tuesday that she did not know whether the agency had been consulted, and if so whether it had raised any concerns.

At one point Sen. Chris Van Hollen (D-Md.) asked four questions in a row about the agency’s participation, pausing after each one, and Svinicki and her four fellow commissioners remained silent.

“I know you don’t have sign-off authority, but none of you at this table know whether the NRC raised any concerns about entering in these 810 authorizations?” he asked.

“I do not,” Svinicki replied.

The term “Part 810 authorizations” refers to permission to share technological information but not physical equipment. For equipment, companies need a separate approval under a 123 Agreement, which the United States and Saudi Arabia have not concluded.

The exchange between Van Hollen and Svinicki illustrates growing concern in Congress over the Energy Department’s authorization of Part 810 information — nonclassified but sensitive details about nuclear energy reactors U.S. companies are trying to sell to Saudi Arabia.

Last week, the administration divulged that it had kept secret from Congress as well as the public seven authorizations for nuclear energy companies to use in wooing Saudi Arabia, a potential customer interested in building two nuclear reactors for civilian purposes. The information kept under wraps includes the identity of the companies and the type of information.

In the past, that information has been placed in the Energy Department’s reading room. But Energy Secretary Rick Perry said the companies had asked for confidentiality because of proprietary information.

Rep. Brad Sherman (D-Calif.) on Tuesday sent a letter to Perry demanding that the administration share more information about the Part 810 disclosures with the House Foreign Affairs subcommittee on Asia, the Pacific and nonproliferation.

“I fully understand and respect the need for U.S. companies to protect their proprietary information from competitors,” Sherman wrote. “At the same time, however, Congress must be given sufficient information to fulfill its constitutional oversight responsibilities.”

He noted that the Atomic Energy Act “stated in a number of clauses that the executive branch must keep Congress ‘fully and currently informed.’ ”

In the Senate hearing, Van Hollen said, “You have a statutory and regulatory role to play here, and I’ve got to say it’s astounding that not a single one of you is aware of whether, when and what role the NRC played in that particular authorization.”

Members of Congress also are seeking information about when the 810 approvals were issued, an effort to determine whether the administration issued any of them after the killing of Washington Post contributing columnist Jamal Khashoggi in the Saudi Consulate in Istanbul.

Van Hollen pressed Svinicki on that timing, just as lawmakers had pressed Perry on timing last week.

“I don’t have that answer for you today, senator,” Svinicki said Tuesday. “I would need to get back to you.”

In an interview later, Van Hollen said that administration officials “appear willing to short-circuit the process to achieve their political goal of continuing to cozy up to the Saudi regime.” He added, “at the very least it is clearly unwilling to stand up to the Saudis on human rights while at the same time bending over backwards to give the Saudis access to nuclear material and technology.”

#####EOF##### Lawsuit: Women accuse Sharp Grossmont Hospital of secretly recording them during medical procedures - The Washington Post

Hidden hospital cameras filmed women during childbirth and miscarriage procedures, lawsuit says


(iStock)

A California women’s hospital is accused of placing hidden cameras in operating rooms, where it secretly recorded scores of patients in stirrups during intimate medical procedures, including treatment after miscarriages, a lawsuit alleges.

The lawsuit, filed last week, claims the Women’s Health Center at Sharp Grossmont Hospital invaded about 1,800 patients’ privacy over an 11-month period starting in summer 2012, filming them in vulnerable positions — sometimes unconscious, sometimes only partially robed and sometimes with their faces or genitals exposed.

These women were reportedly undergoing procedures such as Caesarean sections, hysterectomies, sterilizations and dilation and curettages (D&Cs) after miscarriages.

“It was a highly stressful and emotional time for my family and my doctor. No one ever asked me to record one of my most tender, life-changing moments,” one of the patients, Melissa Escalera, told NBC San Diego about the secretly recorded moment in September 2012 that her daughter was delivered via an emergency C-section. “I would have never agreed to be recorded in that vulnerable moment.”

Sharp Grossmont Hospital’s parent company, Sharp HealthCare, said in a statement this week that from July 2012 to June 2013, computer monitors with motion-activated cameras were set up to record in three operating rooms in the women’s health center as part of an investigation into medications that had gone missing from drug carts.

“Although the cameras were intended to record only individuals in front of the anesthesia carts removing drugs, others, including patients and medical personnel in the operating rooms, were at times visible to the cameras and recorded,” Sharp HealthCare said in the statement. It said it could not comment further on the case because of pending litigation but added: “We sincerely regret that our efforts to ensure medication security may have caused any distress to those we serve.”

More than 80 women claim in the lawsuit that they were filmed without their knowledge or consent at the hospital in La Mesa, outside San Diego.

But more than that, the women say, the hospital was “grossly negligent” in how it stored their most personal and private moments — on desktop computers that numerous users could access, some without password protection.

“It’s horrifying to think that, especially in today’s day and age of the ubiquity of videos on the Internet, if one of those videos were to get in the wrong hands, there’s no controlling it. It takes your own medical care outside your own control,” Allison Goddard, an attorney for the women, told CNN. She could not immediately be reached by The Washington Post.

The lawsuit stated that the cameras had been installed on the drug carts to determine whether an employee was stealing propofol, a powerful sedative.

But according to the lawsuit, they “were set up to record when any person entered an operating room, to record a wide range of activity in the operating room beyond access to the drug cart, and to continue recording even after motion stopped,” meaning about 1,800 patients were recorded during that time.

“The cameras captured images of patients entering the operating rooms, being moved onto surgery tables and exiting. Because of the angle and placement of the cameras, patients’ faces were recorded, and the patients were identifiable,” according to the lawsuit.

The lawsuit added that the recordings showed patients “conscious and unconscious, partially robed on operating room tables, undergoing medical procedures and communicating with their doctors and medical personnel. Because of the nature of these procedures, the recordings captured women while they were emotionally and physically exposed, and at their most vulnerable.”

And, according to the lawsuit, at times their “most sensitive genital areas” were visible to the camera.

Art Caplan, a professor of bioethics and head of the division of medical ethics at New York University School of Medicine, acknowledged that drug diversion in hospitals is a “huge, huge problem” but said investigations that involve patients, their identities and their private information must be handled with care and under the proper authorities.

“You better have a pretty good rationale for taping anybody,” Caplan said, adding that patients in such situations are “completely vulnerable,” often in pain and “unable to look out for their own interests.” And, he said, any sensitive material that is obtained, such as video, would need to be tightly secured; otherwise, it would violate HIPAA (Health Insurance Portability and Accountability Act), which protects patients’ personal information.

Caplan called the Sharp investigation “amateurish,” saying it was not handled ethically and that it compromised fundamental privacy rights.

“If you’re going to do a serious police investigation, you have to do a serious police investigation,” he said.


#####EOF##### Ann E. Marimow - The Washington Post

Ann E. Marimow

Washington, D.C.

Reporter covering legal affairs
Education: Cornell University
Ann Marimow writes about legal issues for The Washington Post, primarily from the federal courts in the Washington region. She previously covered state government and politics in California, New Hampshire and Maryland. She joined The Post in 2005.
Honors & Awards:
  • Nieman fellow, Harvard University
Foreign languages spoken: Spanish
Latest from Ann E. Marimow

The First Amendment bans “viewpoint discrimination” by the government, but Justice Department lawyers contend the account is a personal platform.

  • Mar 26, 2019

The president landed in court over his blocking of critics, and elected officials across the country are getting the legal message.

  • Mar 25, 2019

Some presidential contenders demand that the Justice Dept. make public the underlying evidence.

  • Mar 24, 2019

An appeals panel heard arguments over the emoluments clauses in a case brought by the D.C. and Maryland attorneys general.

  • Mar 19, 2019

The former campaign chairman for President Trump apologized Wednesday for his crimes at a hearing in Washington.

  • Mar 13, 2019

The revamping of the disciplinary system follows sexual misconduct claims against former appeals-court judge Alex Kozinski.

  • Mar 12, 2019

The order issued in Maryland followed the Supreme Court’s decision in January to lift nationwide injunctions imposed by judges in other cases fighting the policy.

  • Mar 7, 2019

Prosecutors said he “corrupted” two city agencies and undermined companies that followed the rules in competing for city contracts.

  • Mar 4, 2019

The ruling could have implications for organizations involved in financing overseas development.

  • Feb 27, 2019

#####EOF##### Facebook deserves criticism. The country deserves solutions. - The Washington Post
The Post's View

Facebook deserves criticism. The country deserves solutions.


Facebook chief executive Mark Zuckerberg at F8, Facebook's developer conference, in San Jose on May 1. (Marcio Jose Sanchez/AP)

WHAT HAPPENS now? That is the essential question following the New York Times’s troubling investigation into Facebook’s response to Russian interference on its platform. The article has prompted sharp criticism of the company from all quarters, and Facebook deserves the blowback. But Americans deserve solutions. There are a few places to start.

Facebook has disputed the Times’s characterization of its efforts to combat Moscow’s meddling, which the paper reports were worse than insufficient. According to the Times, executives not only sought to play down the degree to which Russians had used Facebook to manipulate American voters but also embarked on a campaign to discredit critics and turn attention toward other companies.

Most concerning is Facebook’s hiring of a public-relations firm, Definers, that played into conspiracy theories by linking grass-roots opposition to billionaire George Soros — while Facebook also lobbied a Jewish civil rights group to cast other negative rhetoric as anti-Semitic. These tactics exploit one of America’s deepest divides, amplifying the same political polarization Facebook claims it is trying to stop its platform from promoting. (Facebook CEO Mark Zuckerberg says he and chief operating officer Sheryl Sandberg were unaware of Definers’s involvement.)

It is strategies such as these that make it difficult to write off Facebook’s failings as part of a maturation process for the Silicon Valley start-up turned titan. Facebook may finally be wrapping its head around its responsibilities, but it refused to until it began to fear a regulatory onslaught, and even then, it went to great lengths to avoid a reckoning.

This reality underscores the need for Congress to keep pushing, even as Facebook takes strides of its own in policing its platform and keeping the public apprised of what that work looks like. The company announced Thursday that it would expand its appeals process for removals. Facebook will also start demoting “sensationalist and provocative content” that skirts its terms of service but does not violate them. And it will continue releasing transparency reports on the accounts and posts it does end up taking down.

These are all worthy endeavors. Congress’s role is to continue applying pressure to Facebook to ensure the company lives up to its promises — and to step in when it does not. Some areas for intervention are obvious: The Honest Ads Act, currently stalled in the Senate, would stop advertisers from hiding their identities from users. A robust federal privacy law would protect consumers’ data against incursions such as the Cambridge Analytica breach revealed this year.

At the core of the conversation, however, is a stickier subject: speech. There is danger in asking government to weigh in on what civilians can say and where they can say it. But whether Facebook can succeed on its own — without, at least, rules such as those Mr. Zuckerberg himself has proposed requiring companies to report the prevalence of malicious content on their sites and setting thresholds for reduction — seems less certain than ever before. Congress’s role over the next year will be to determine, through ceaseless scrutiny, whether the company is up to the task.


#####EOF##### Declassified report says Putin ‘ordered’ effort to undermine faith in U.S. election and help Trump - The Washington Post

Declassified report says Putin ‘ordered’ effort to undermine faith in U.S. election and help Trump


President-elect Donald Trump talks to reporters at Mar-a-Lago on Dec. 28 in Palm Beach, Fla. (Ricky Carioti/The Washington Post)

Russia carried out a comprehensive cyber campaign to sabotage the U.S. presidential election, an operation that was ordered by Russian President Vladi­mir Putin and ultimately sought to help elect Donald Trump, U.S. intelligence agencies concluded in a remarkably blunt assessment released Friday.

The report depicts Russian interference as unprecedented in scale, saying that Moscow’s role represented “a significant escalation in directness, level of activity, and scope of effort” beyond previous election-related espionage.

The campaign initially sought to undermine public faith in the U.S. democratic process, “denigrate” Democratic presidential candidate Hillary Clinton and damage her expected presidency. But in time, Russia “developed a clear preference for President-elect Trump” and repeatedly sought to artificially boost his election chances.

The report released to the public is an abbreviated version of a highly classified multiagency assessment requested by President Obama. Even so, it amounts to an extraordinary postmortem of a Russian assault on a pillar of American democracy.

The 14-page document made public also serves as an explicit rebuttal to Trump’s repeated assertions that U.S. spy agencies cannot determine who was responsible for a hacking operation that extracted thousands of emails from Democratic Party computer networks and dumped them into public view via the WikiLeaks website.


In the report, the CIA, FBI and Office of the Director of National Intelligence concluded with “high confidence” that Russian intelligence services penetrated numerous computer systems tied to U.S. political parties and then “relayed” the email troves to WikiLeaks.

Trump emerged from a briefing by the nation’s top intelligence officials on the contents of the report acknowledging at least the possibility that Russia was behind election-related hacks. But he offered no indication that he was prepared to accept their conclusions that Moscow sought to help him win.

Instead, Trump said in a statement that while Russia, China and other countries and groups may have sought to breach Democratic and Republican computer systems, “there was absolutely no effect on the outcome of the election.”

The report did not address that issue. It was presented to Trump by officials including Director of National Intelligence James R. Clapper Jr., CIA Director John Brennan and FBI Director James B. Comey.

Trump also said that “there was no tampering whatsoever with voting machines.” That appeared to be consistent with the findings of the report, although it noted that Russia “obtained and maintained access” to numerous election systems that “were not involved in vote tallying.”

A footnote in the document said that the conclusions contained in the declassified draft were “identical to those in the highly classified assessment but this version does not include the full supporting information on key elements of the influence campaign.”

Obama commissioned the report shortly after the Nov. 8 election, and recently ordered a series of retaliatory measures including new economic sanctions, the expulsion of dozens of suspected Russian intelligence operatives from the United States and the closure of two Russian-owned compounds in the country.


The report was met with mixed reactions from senior lawmakers. Sen. Richard Burr (R-N.C.), the chairman of the Senate Intelligence Committee, described the Russian activities cited in the report as “a troubling chapter in an ongoing story, and I expect that our nation’s leaders will counter these activities appropriately.”

His counterpart in the House, Rep. Devin Nunes (R-Calif.), used the report to criticize Obama, saying that the House Intelligence Committee “has been warning the Obama administration for years about the need for stronger measures against Russia . . . but our warnings largely fell on deaf ears.”

The public version of the report does not explicitly mention some of the most sensitive pieces of intelligence that helped analysts reach their conclusions. U.S. officials have said that spy agencies identified certain “actors” involved in the cyber offensive, believe Russia was far more focused on penetrating and exploiting Democratic systems, and intercepted communications making clear that top Russian officials congratulated themselves on Trump’s win.

One of the report’s key judgments is that “Putin and the Russian government aspired to help President-elect Trump’s election chances when possible by discrediting Secretary Clinton and publicly contrasting her unfavorably to him.”

Moscow did so in part because it “developed a clear preference for President-elect Trump,” who as a candidate repeatedly praised Putin and advocated policies in Syria and Europe strongly favored by the Kremlin.

But the report also attributed Russia’s efforts to Putin’s hostility toward Clinton, a former senator and secretary of state whom he blamed for inciting mass protests against his government in 2011 and 2012.

Overall, the report describes a multipronged campaign that involved not only hacking, but overt propaganda on Russian-controlled news platforms and the extensive use of social media and even “trolls” to amplify voter discord in the United States and encourage opposition to Clinton.

Despite those exertions, Russia appears to have concluded that a Clinton victory was inevitable right up until election night. As a result, Moscow focused on finding ways to undercut Clinton’s legitimacy if she won.

One of the more colorful notes in the report describes how “pro-Kremlin bloggers had prepared a Twitter campaign, #DemocracyRIP, on election night,” then had to shelve it when Trump won.

The document traces interference efforts that began with inconspicuous probes of U.S. electoral systems in early 2014, carried through the election, and may still be underway.

Russian intelligence agencies first gained access to Democratic National Committee networks in July 2015, the report says. Russia’s military intelligence service, known as the GRU, “probably” expanded its efforts in March 2016, going after the email accounts of Democratic Party officials and other political figures.

By May, the GRU had stolen what the report describes as “large volumes of data from the DNC.” In the ensuing months, chunks of that trove began to appear on websites including WikiLeaks, generating a steady stream of headlines that embarrassed Democrats and kept voter attention on Clinton’s email controversy.

Putin has repeatedly denied that Russia was responsible for the hacked emails. In an interview with the New York Times on Friday, Trump called the sustained focus on the issue a “political witch hunt.”

The intelligence assessment also drew the most direct line to date between Putin’s desire to aid Trump’s campaign and Russia’s policies and objectives in Syria and Ukraine. In both cases, it said, Putin “indicated a preference for President-elect Trump’s stated policy to work with Russia, and pro-Kremlin figures spoke highly about what they saw as his Russia-friendly positions” in those two countries.

Putin is also eager for relief from economic sanctions imposed on Russia for its support of separatist forces in Ukraine and its annexation of Crimea.

As recently as this week, Trump appeared to be siding with WikiLeaks founder Julian Assange — who has denied that his website got purloined emails from Russia — over the determinations of the CIA and FBI.

The report provides new details about U.S. intelligence agencies’ view of WikiLeaks and its relationship with Russia. “We assess with high confidence that the GRU relayed material it acquired from the DNC and senior Democratic officials to WikiLeaks,” the report said. “Moscow most likely chose WikiLeaks because of its self-proclaimed reputation for authenticity.”

The report noted that none of the files passed to WikiLeaks contained “evident forgeries.”

The document said that “Guccifer 2.0,” the online identity of a hacker purportedly involved in the campaign, “made multiple contradictory statements and false claims about his likely Russian identity throughout the election.” It was the document’s clearest indication that U.S. spy agencies believe they have identified him.

In some ways, Russia’s intervention in the 2016 election is consistent with a long-standing pattern that traces back to the Soviet Union of espionage against prominent politicians and policymakers in the United States. U.S. spy agencies also devote significant resources to gathering intelligence on Putin and his subordinates.

The report suggests that beyond his animosity toward Clinton, Putin may also have been driven by his own conviction that Moscow has been repeatedly targeted with embarrassing leaks that he attributes to the United States, including the Panama Papers files that showed how wealthy individuals close to the Kremlin had hidden their fortunes, as well as material that helped expose the doping scandal among Russia’s Olympic athletes.

Putin’s success in using cyber capabilities and propaganda to disrupt a U.S. presidential race is likely to embolden him to mount similar operations against the United States and its allies in the future. “We assess Russian intelligence services will continue to develop capabilities to provide Putin with options to use against the United States,” the report said.

Karen DeYoung and Julie Tate contributed to this report.


#####EOF##### Jamal Khashoggi’s long road to the doors of the Saudi Consulate - The Washington Post

Jamal Khashoggi’s long road to the doors of the Saudi Consulate

Columnist

The long road that took Jamal Khashoggi to the front door of the Saudi Consulate in Istanbul and the horror that lay inside began in the 1980s in Afghanistan, when he was a passionate young journalist who supported the Saudi establishment — but couldn’t resist criticizing the royal family when he thought it was wrong.

Khashoggi’s path took him through risky territory. He was friendly with Osama bin Laden in his militant youth; his patron in mid-career was Prince Turki al-Faisal, the longtime Saudi intelligence chief; he traveled sometimes to Qatar in the past decade, as a poisonous feud grew between Riyadh and Doha. But his public writings and private messages show that in his head and heart, he was always a Saudi patriot.

Conversations with some of Khashoggi’s close friends, who shared texts they exchanged with him over the years, reveal a man whose greatest passion became journalism itself — which he expressed in a fearless, unblinking commitment to the cleansing power of the truth, regardless of the personal cost.

Khashoggi wondered often along this journey if he should back off, ease up and take fewer risks. But he kept speaking out, knowing the danger. His truth-telling got him fired from prominent editing jobs, rehired and then fired again. At the time of his disappearance, Arab journalism had become a cause he appeared willing to die for.

A portrait of the young Khashoggi comes from Barnett Rubin, senior fellow at New York University’s Center on International Cooperation and one of the United States’ top experts on Afghanistan. They met in 1989 at the U.S. Consulate in Jiddah, when Rubin was on a speaking tour. Khashoggi, then 31, shared a two-part series he had written the year before about his travels with the Arab mujahideen in Afghanistan. One of the photos showed the tall, bearded reporter standing among the Arab fighters, cradling a rocket-propelled grenade launcher in his hands.

Khashoggi couldn’t have traveled with the mujahideen that way without tacit support from Saudi intelligence, which was coordinating aid to the fighters as part of its cooperation with the CIA against the Soviet Union in Afghanistan. But Rubin remembers that during the conversation, Khashoggi criticized Prince Salman, then governor of Riyadh and head of the Saudi committee for support to the Afghan mujahideen, for unwisely funding Salafist extremist groups that were undermining the war.

“It was typical Jamal,” remembers Rubin. “We had just met for the first time and he began complaining” to a near-stranger about mistakes by the royal family. A more careful person would have kept his mouth shut. But that wasn’t Khashoggi.

There’s a fatal symmetry to that 1989 conversation with Rubin: Salman is now king of Saudi Arabia, and his son Mohammed bin Salman is crown prince. Intelligence sources told The Post this week that MBS, as he’s known, plotted to lure and detain Khashoggi, whose outspoken commentary the crown prince feared and hated.

What happened after Khashoggi entered the Saudi Consulate in Istanbul is a macabre mystery. Turkish officials say he was interrogated, killed and hacked into pieces by a 15-man hit squad sent from Riyadh; several U.S. sources speculate that the Saudis might have tried to kidnap Khashoggi back to the kingdom and botched the job. What’s certain is that Khashoggi’s disappearance from the consulate was a flagrant attack on a courageous journalist.


A Turkish police officer providing security enters Saudi Arabia's consulate in Istanbul. (Getty Images)

Khashoggi’s intellectual interests were shaped in his early 20s when he studied in the United States and was also a passionate member of the Muslim Brotherhood. The brotherhood was a secret underground fraternity that wanted to purge the Arab world of the corruption and autocratic rule it saw as a legacy of Western colonialism. Khashoggi was hardly alone in this belief.

The flavor of that period in Khashoggi’s life was captured by Lawrence Wright, a journalist for the New Yorker who met him in Saudi Arabia more than 15 years ago. In his book “The Looming Tower,” Wright quotes Khashoggi about the brotherhood’s appeal: “We were hoping to establish an Islamic state anywhere. We believed that the first one would lead to another, and that would have a domino effect which could reverse the history of mankind.”

Bin Laden joined the brotherhood at about the same time Khashoggi did, in the late 1970s, says Wright. The two men shared a passion for the mujahideen’s war in Afghanistan, first against the Soviet Union and later for power in Kabul. Khashoggi was covering the war as a journalist, but he was clearly sympathetic to the cause.

But by the mid-1990s, Khashoggi’s friends say he had become wary of the extremism of bin Laden and other jihadists. He was moving toward his mature belief that democracy and freedom were the Arabs’ best hope of purging the corruption and misrule he despised.

An important confrontation between Khashoggi and bin Laden came during a 1995 interview in Sudan. Wright recounts how bin Laden bragged about how his terrorism would drive the United States from the Arabian Peninsula. The journalist pressed him to disavow violence inside Saudi Arabia: “Osama, this is very dangerous. It is as if you are declaring war. You will give the right to the Americans to hunt for you.” Bin Laden refused Khashoggi’s efforts to get a statement on the record.

Khashoggi concluded in the 1990s that the Afghanistan civil war was disastrous, as well. Rubin remembers a conversation with him in the mid-1990s in New York. It was snowing, and Rubin was wearing an Afghan hat like the ones worn by mujahideen fighters. “We don’t wear that hat,” Khashoggi chided him. “We call it the hat that destroyed Afghanistan.”


Saudi journalist Jamal Khashoggi (Hasan Jamali/AP)

On Sept. 11, 2001, the world saw the catastrophic effects of al-Qaeda’s extremist ideology. Unlike some Saudis, Khashoggi didn’t try to excuse the fact that Saudis had flown the planes used in the attack. He wrote a column on Sept. 10, 2002, saying that Arabs should recognize that bin Laden had attacked Saudi Arabia and Islam when he struck the twin towers.

Maggie Mitchell Salem, one of Khashoggi’s closest friends, first met him in 2002 when she was working for the Middle East Institute. (She now heads the Qatar Foundation International, an institute in Washington partially funded by Qatar that supports cross-cultural education.) The venue was a conference organized by the Arab Thought Foundation, an organization that was supported by the Faisal family.

Salem remembers an ambitious journalist whose rise was linked with the Faisal clan — Turki and his brother Saud al-Faisal, the longtime Saudi foreign minister. Educated at Georgetown and Princeton, respectively, the Faisal brothers represented the thoughtful, moderate face of the royal family.

Khashoggi was close to these royals, but he was prepared to criticize the monarchy’s clerical establishment, too. In 2003, he became editor of Al Watan, a progressive newspaper owned by the Faisal family. But he lasted less than two months. He was fired after he published criticism of the Saudi religious leadership.

The Faisal family came to Khashoggi’s rescue. Turki was ambassador to London at the time, and he invited Khashoggi to come work for him there. “To save his life, they got him out of the kingdom,” remembers Salem.

When Turki moved to Washington in 2005 as ambassador, he brought Khashoggi with him as spokesman. It was a difficult time: Prince Bandar bin Sultan, the former ambassador (nicknamed “Bandar Bush” because of his close relationships with Bush 41 and 43), was continuing to visit the White House secretly. That angered Turki; the new ambassador was also trying to trim spending for Saudi Arabia’s public-relations firm in Washington, Qorvis Communications.

Salem recalls talking with Khashoggi about the tension between Turki and Bandar. “He thought that was what royals do to each other,” she recalls. “He felt frustrated for Prince Turki.”

Khashoggi, as Turki’s spokesman, was caught in the crossfire. It was “death by a thousand paper cuts,” for Khashoggi, says Salem.

After Turki was replaced as ambassador in 2007, Khashoggi returned to the kingdom for a second stint as editor of Al Watan. He managed to hold that job until 2010, when he published criticism of Salafist extremism, a problem that had worried him for more than 20 years. He was out in the cold again.

Rubin recalls visiting Khashoggi in Riyadh in 2009, after Rubin had joined the Obama administration as the Afghanistan expert for special envoy Richard Holbrooke. He says that one of his Saudi contacts “warned me not to see Jamal” because “it will make the government suspicious.” Rubin saw him anyway.

The Arab Spring exploded in early 2011, as street protesters in Cairo’s Tahrir Square drove Egyptian President Hosni Mubarak from office. For the Saudi leadership, it was a nightmare, but for Khashoggi, the pro-democracy movement was a dream come true. I remember talking with him in January, just before Mubarak’s fall; he told me that the Arab “renaissance” that had been building for a century was finally happening. In that, as in some other things, his optimism was premature.

Khashoggi’s next patron for his free-thinking journalism was Prince Alwaleed bin Talal, a reform-minded Saudi billionaire. Alwaleed bankrolled a new satellite television channel, called Al-Arab, based in Bahrain, which was seen as a potential rival to Al Jazeera, based in Qatar. The station went on air in 2015, after four years of planning, as the Arab Spring was rocking Bahrain and everywhere else in the region.

On its very first day of broadcasting, Khashoggi’s new TV channel featured an interview with a prominent Bahraini Shiite politician who had criticized the regime. The station didn’t last 24 hours before the Bahraini authorities pulled the plug. Khashoggi was out of a job, yet again.

Salem remembers telling Khashoggi: “ ‘Dude, what did you think was going to happen?’ This was so Jamal. The man put the Shia opposition on his TV channel. He couldn’t let a week go by” without doing real journalism.

The Bahrain episode defined a quality of optimism described by several of Khashoggi’s close friends. Salem explains it this way: “He had an eternal belief that things were good, and that right would win. He had a goodness, and a belief in other people’s goodness.”

The demise of the Arab Spring was a painful experience for Khashoggi. Rubin shared with me a text message Khashoggi sent him in June 2014: “Hello Rubin, As you can see things going from bad to worst in the Arab world. The only hope I think of someone can restore drive for democracy, that positive feeling that spread in Arab world after 2011.”

After the Bahrain TV station was closed, Rubin tried to cheer him up, texting: “I see you are creating problems again. Keep it up.” Khashoggi answered: “I been thinking if it’s time I gave up and retire somewhere safe in the west just to be free and write freely . . . we will never have freedom in the arab world without true democracy.”

Khashoggi was now writing a column for Al Hayat, an Arabic newspaper published in London. In December 2015, he wrote one of the clearest statements of what animated him on his journey: “In the Arab world, everyone thinks journalists cannot be independent, but I represent myself, which is the right thing to do. What would I be worth if I succumbed to pressure to change my opinions?”


Saudi King Salman bin Abdul Aziz (Saudi Royal Palace/AFP/Getty Images)

Saudi Crown Prince Mohammed bin Salman (Alastair Grant/AP)

Khashoggi’s world darkened with the rise of MBS, whose father became king in January 2015. First as deputy crown prince and later as crown prince, MBS used a network of operatives centered on the royal court to consolidate power and, increasingly, suppress dissent.

MBS proclaimed his desire to modernize the kingdom, and introduced some reforms Khashoggi supported, such as allowing women to drive, opening movie theaters and other entertainment, and suppressing religious extremism. At first, Khashoggi tried to be optimistic. Rubin texted him in January 2016, after an early wave of arrests and executions of Shiite clerics and accused militants, and said he worried that “maybe the real target of the executions were people like you.” Khashoggi responded hopefully: “Thanks, my friend, I didn’t read it that way, the gov . . . want to appear tough against extremism not us the people of SA.”

But the arrests and purges continued. MBS staged a soft coup in June 2017, replacing Crown Prince Mohammed bin Nayef. Then in November 2017 came the sweeping arrests of more than 200 Saudis, including many princes, who were held at the Ritz-Carlton in Riyadh.

Khashoggi was staying in London and saying little in public as MBS grew more aggressive. Salem hosted a dinner for Khashoggi on July 4, 2017, at a London restaurant called Clos Maggiore in Covent Garden. He told her that night that he had decided to seek refuge in the United States, and he moved here a few weeks later. He texted Rubin in September about MBS: “This kid is dangerous, I’m under pressure . . . to be ‘wise’ and stay silent. I think I should speak wisely.” But in the end, he couldn’t censor himself.

Khashoggi gained a powerful platform in September 2017 when The Post asked him to be a regular contributor to its Global Opinions online forum. From his very first column, Khashoggi took aim at MBS: “When I speak of the fear, intimidation, arrests and public shaming of intellectuals and religious leaders who dare to speak their minds, and then I tell you that I’m from Saudi Arabia, are you surprised?”

The day that first Post commentary appeared, Khashoggi texted Karen Attiah, his editor for The Post’s Global Opinions section, saying that “it’s so painful for me to publish this piece.” He wrote that he never thought his country would turn toward “intimidation, lies and hate.”


Jamal Khashoggi with his fiancee, Hatice Cengiz. (Courtesy of Hatice Cengiz)

Friends helped Khashoggi obtain a visa that allowed him to stay in the United States as a permanent resident. Rubin wrote a letter to the Department of Homeland Security recalling his first meeting back in 1989 and attesting to his “character, intellect, knowledge, political perspective and work as a journalist.”

In his last messages to Attiah before he disappeared, Khashoggi described his dream of creating a broader platform at The Post for honest news and commentary in Arabic. He didn’t want to be an “exile dissident,” she says, but a journalist. “His eyes lit up” walking around the Post newsroom, she remembers, and he would say: “I wish we could build this in the Middle East.”

As always, Khashoggi wondered whether he could step back and reduce the danger. When he visited Salem for the last time in August in her office, he told her: “I’m thinking that for two years, I want to go to a faraway island.” He wondered aloud: “Can I just give this up? Can I just not do this anymore?” The answer was always the same: No, he couldn’t give up.

Read more from David Ignatius’s archive, follow him on Twitter or subscribe to his updates on Facebook.

We are a participant in the Amazon Services LLC Associates Program, an affiliate advertising program designed to provide a means for us to earn fees by linking to Amazon.com and affiliated sites.


Read Jamal Khashoggi’s columns for The Washington Post

Jamal Khashoggi, a veteran Saudi journalist, was killed in Istanbul after walking into the consulate of Saudi Arabia, according to Turkish officials. In a statement released Saturday, Fred Hiatt, The Post’s editorial page editor, said that if true, this would represent “a monstrous and unfathomable act.”

Khashoggi had been writing a column for The Post’s Global Opinions section since last year. “He lamented that Saudi Arabia’s repression was becoming unbearable to the point of his decision to leave the country and live in exile in Washington,” wrote Karen Attiah, Khashoggi’s editor, on Wednesday.

Hiatt, in his statement, called Khashoggi a “committed, courageous journalist.”

“He writes out of a sense of love for his country and deep faith in human dignity and freedom,” Hiatt said. “We have been enormously proud to publish his writing.”

Read excerpts from some of Khashoggi’s columns below.

Saudi Arabia wasn’t always this repressive. Now it’s unbearable. – Sept. 18, 2017

When I speak of the fear, intimidation, arrests and public shaming of intellectuals and religious leaders who dare to speak their minds, and then I tell you that I’m from Saudi Arabia, are you surprised?

With young Crown Prince Mohammed bin Salman’s rise to power, he promised an embrace of social and economic reform. He spoke of making our country more open and tolerant and promised that he would address the things that hold back our progress, such as the ban on women driving.

But all I see now is the recent wave of arrests. Last week, about 30 people were reportedly rounded up by authorities, ahead of the crown prince’s ascension to the throne. Some of the arrested are good friends of mine, and the effort represents the public shaming of intellectuals and religious leaders who dare to express opinions contrary to those of my country’s leadership. …

It was painful for me several years ago when several friends were arrested. I said nothing. I didn’t want to lose my job or my freedom. I worried about my family.

I have made a different choice now. I have left my home, my family and my job, and I am raising my voice. To do otherwise would betray those who languish in prison. I can speak when so many cannot. I want you to know that Saudi Arabia has not always been as it is now. We Saudis deserve better. [Read more] [Read in Arabic]

Saudi Arabia’s crown prince wants to ‘crush extremists.’ But he’s punishing the wrong people. – Oct. 31, 2017

Prince Mohammed is right to go after extremists. But he is going after the wrong people. Dozens of Saudi intellectuals, clerics, journalists and social media stars have been arrested in the past two months — the majority of whom, at worst, are mildly critical of the government. Meanwhile, many members of the Council of Senior Scholars (“Ulema”) have extremist ideas. Sheikh Saleh Al-Fawzan, who is highly regarded by Prince Mohammed, has said on Saudi TV that Shiites are not Muslims. Sheikh Saleh Al-Lohaidan, also highly regarded, has given legal advice that the Muslim ruler is not bound to consult others. Their reactionary opinions about democracy, pluralism or even women driving are protected by royal decree from counterargument or criticism.

How can we become more moderate when such extremist views are tolerated? How can we progress as a nation when those offering constructive feedback and (often humorous) dissent are banished? [Read more]

Saudi Arabia’s crown prince is acting like Putin – Nov. 5, 2017

Corruption in Saudi Arabia is quite different from corruption in most other countries, as it is not limited to a “bribe” in return for a contract, an expensive gift for the family member of a government official or prince, or use of a private jet that is charged to the government so a family can go on vacation.

Instead, in Saudi Arabia, senior officials and princes become billionaires as contracts are either enormously inflated or, at worst, a complete mirage. In 2004, Lawrence Wright wrote in the New Yorker about “The Kingdom of Silence” where a massive sewer project in Jeddah was really a series of manhole covers across the city with no actual pipes underneath. I, as the editor of a major paper at the time, can say that we all knew, and we never reported on it. [Read more]

Saudi Arabia is creating a total mess in Lebanon – Nov. 13, 2017

Today, Saudi Arabia alone is the most politically stable and economically secure country in the region. Neither the kingdom nor our conflict-ridden region can afford to see my country lose its footing. MBS’s rash actions are deepening tensions and undermining the security of the Gulf states and the region as a whole. [Read more]

With Ali Abdullah Saleh’s death, Saudi Arabia is paying the price for betraying the Arab Spring – Dec. 5, 2017

The choice of waging even more war is tempting for those in Riyadh who want an overwhelming defeat for the Houthis and to get them out of the political game, but it will be very costly — not only for the kingdom but for the Yemeni people, who are already suffering immensely. This conflict is the horrific result of preventing the people of Yemen from achieving their desire for freedom. Now the Houthis have become a significant force, and they do not hold the values of the Arab Spring based on power sharing. The world is watching Yemen; not only should the Saudis stop the war, but there should be pressure for the Iranians to stop their support for the Houthis; both sides must accept a Yemeni formula to share power. Perhaps the fall of Saleh the tyrant is a chance for peace in Yemen. [Read more]

Why Saudi Arabia’s crown prince should be worried about Iran’s protests – Jan. 3, 2018

It is still too early to judge how the events in Iran will unfold. If the hard-liners succeed in suppressing the protests, they will continue their expansionist policy, which could mean an escalation of the confrontation with Saudi Arabia. If the regime or [Hassan] Rouhani’s government falls, the chants heard in a number of Iranian cities — “Neither Gaza nor Lebanon, my life will only be sacrificed for Iran” — could become the country’s foreign policy. [Read more] [Read in Arabic]

Saudi Arabia’s crown prince already controlled the nation’s media. Now he’s squeezing it even further. – Feb. 7, 2018

When many of Saudi Arabia’s media tycoons ended up in Riyadh’s Ritz-Carlton along with more than 300 royals, senior officials and wealthy businessmen accused of corruption, many people assumed that the kingdom’s strongman, Crown Prince Mohammed bin Salman, aims to control the media, too.

This is far from true, simply because he already does. [Read more]

What Saudi Arabia’s crown prince can learn from Queen Elizabeth II – Feb. 28, 2018 (co-bylined with Robert Lacey)

MBS’s downsizing and relative humbling of the House of Saud is welcome news. But maybe he should learn from the British royal house that has earned true stature, respect and success by trying a little humility himself. If MBS can listen to his critics and acknowledge that they, too, love their country, he can actually enhance his power. [Read more] [Read in Arabic]

Why Saudi Arabia’s crown prince should visit Detroit – March 20, 2018 (co-bylined with Robert Lacey)

Many inner cities in Saudi Arabia fester today as Detroit once did — they are miserable Third World slums that completely mock the oil riches of the kingdom. So, before MBS ventures into building new cities, perhaps he should deal with the old ones. During his visit to Egypt, which kicked off his current global tour, the crown prince revealed his shared dream with Egyptian President Abdel Fatah al-Sissi of building a prosperous region in northern Saudi Arabia stretching across the Gulf of Aqaba to Egypt — a “Riviera of the Red Sea” to attract millions of tourists yearly. Yet since neither Saudi Arabia nor Egypt has a free press, no one asked the two leaders about Egypt’s numerous tourist destinations, such as Sharm El Sheikh, Hurghada and El Gouna. All have gorgeous beaches on the very same coast and a chronic lack of tourists; they are sad shadows of the resorts they used to be. Surely that problem should be addressed before splashing out precious government funds on still more cities in the sand. [Read more] [Read in Arabic]

By blaming 1979 for Saudi Arabia’s problems, the crown prince is peddling revisionist history – April 3, 2018

In Saudi Arabia at the moment, people simply don’t dare to speak. The country has seen the blacklisting of those who dare raise their voices, the imprisonment of moderately critical intellectuals and religious figures, and the alleged anti-corruption crackdown on royals and other business leaders. Liberals whose work was once censored or banned by Wahhabi hard-liners have turned the tables: They now ban what they see as hard-line, such as the censorship of various books at the Riyadh International Book Fair last month. One may applaud such an about-face. But shouldn’t we aspire to allow the marketplace of ideas to be open?

I agree with MBS that the nation should return to its pre-1979 climate, when the government restricted hard-line Wahhabi traditions. Women today should have the same rights as men. And all citizens should have the right to speak their minds without fear of imprisonment. But replacing old tactics of intolerance with new ways of repression is not the answer. [Read more] [Read in Arabic]

What Saudi Arabia can learn from ‘Black Panther’ – April 17, 2018

This Wednesday, Disney’s blockbuster “Black Panther” will be shown in theaters in Saudi Arabia, officially ending a decades-long ban on movie theaters in the country. This may seem odd to Americans who have grown up with cinema and popcorn, but to many Saudis it’s a huge step toward normalization. For too long, hard-line religious figures have preached that cinema would bring about the collapse of all moral values. When the Saudi Crown Prince Mohammed bin Salman decided to end the ban, he also effectively stopped the preachers from repeating such foolishness. By taking the lead to remove the ban, he proved that the government has the final say when it comes to deciding what’s permissible or not, and that some things should be left up to the personal choice of citizens, not the clergy. …

At the end of the film, the young king of Wakanda chooses to use his country’s power to engage with the world for the greater good. Will Crown Prince Mohammed bin Salman, who likely will soon become king of his country, use his power to bring peace to the world around him? [Read more] [Read in Arabic]


Saudi Arabia’s reformers now face a terrible choice – May 21, 2018

It is appalling to see 60- and 70-year-old icons of reform being branded as “traitors” on the front pages of Saudi newspapers.

Women and men who championed many of the same social freedoms — including women driving — that Crown Prince Mohammed bin Salman is now advancing were arrested in Saudi Arabia last week. The crackdown has shocked even the government’s most stalwart defenders.

The arrests illuminate the predicament confronting all Saudis. We are being asked to abandon any hope of political freedom, and to keep quiet about arrests and travel bans that impact not only the critics but also their families. We are expected to vigorously applaud social reforms and heap praise on the crown prince while avoiding any reference to the pioneering Saudis who dared to address these issues decades ago. …

The message is clear to all: Activism of any sort has to be within the government, and no independent voice or counter-opinion will be allowed. Everyone must stick to the party line.

Is there no other way for us? Must we choose between movie theaters and our rights as citizens to speak out, whether in support of or critical of our government’s actions? Do we only voice glowing references to our leader’s decisions, his vision of our future, in exchange for the right to live and travel freely — for ourselves and our wives, husbands and children too? I have been told that I need to accept, with gratitude, the social reforms that I have long called for while keeping silent on other matters — ranging from the Yemen quagmire, hastily executed economic reforms, the blockade of Qatar, discussions about an alliance with Israel to counter Iran, and last year’s imprisonment of dozens of Saudi intellectuals and clerics.

This is the choice I’ve woken up to each morning ever since last June, when I left Saudi Arabia for the last time after being silenced by the government for six months. [Read more] [Read in Arabic]

Saudi Arabia’s women can finally drive. But the crown prince needs to do much more. – June 25, 2018

Crown Prince Mohammed bin Salman deserves considerable credit for bringing the matter to a close the right way. While previous leaders were reluctant to take up the issue, he faced it head-on and did the right thing for Saudi Arabia. At the same time, I hope he will not forget the brave actions of each and every Saudi who individually worked hard for freedom and modernization. He should order the release of Hathloul, Aziza al-Yousef, Eman al-Nafjan and the other brave women who campaigned for women’s right to drive. They should be allowed to finally witness the results of their tears and toil. [Read more]

Saudi Arabia’s crown prince must restore dignity to his country — by ending Yemen’s cruel war – Sept. 11, 2018

The longer this cruel war lasts in Yemen, the more permanent the damage will be. The people of Yemen will be busy fighting poverty, cholera and water scarcity and rebuilding their country. The crown prince must bring an end to the violence and restore the dignity of the birthplace of Islam. [Read more] [Read in Arabic]




Coast Guard is ‘approaching a tipping point’: Commandant sounds alarm about aging fleet

Adm. Karl Schultz delivered a ‘State of the Coast Guard’ address in Los Angeles


Adm. Karl Schultz, the Coast Guard commandant, delivers the 2019 State of the Coast Guard Address at Coast Guard Base Los Angeles-Long Beach on March 21, 2019. (Seaman Ryan Estrada / Coast Guard)

LOS ANGELES — The Coast Guard is “approaching a tipping point” when it comes to maintaining a service that can respond in times of need, the service’s top officer said Thursday, requesting more money from Congress and highlighting a backlog of projects that has stacked up for years.

Adm. Karl Schultz, the Coast Guard commandant, said that he is thankful for an increase in funding the service received in 2019 to build new vessels but that the service still has an operations budget that has “essentially been flatlined” over the past eight years.

“In a modestly funded organization like the Coast Guard, this has resulted in deferred maintenance, a strained and undersized workforce and antiquated information systems,” Schultz said. “And we continue to face an extensive shore infrastructure backlog that now exceeds $1.7 billion. That’s particularly problematic for an organization with facilities spread far and wide across the nation.”

Schultz, speaking on a lawn overlooking the port of Los Angeles, requested a 5 percent increase in the service’s 2020 operations budget, up to $7.9 billion, and an overall budget of $11.34 billion, up from $10.6 billion. The service has long had a reputation for keeping aging equipment working, and Schultz said it faces “very real readiness challenges” that must be dealt with.

The comments came during a State of the Coast Guard speech, nine months after Schultz took over as the service’s top officer.

In that time frame, he has been forced to deal with a government shutdown, caused by disagreements over President Trump’s proposed southern border wall, that left service members without pay for weeks and most civilian employees furloughed. The Department of Homeland Security, which oversees the Coast Guard, has also reprogrammed millions of dollars in Coast Guard funding to assist Immigration and Customs Enforcement amid a crackdown on illegal immigration by the Trump administration.

Schultz did not mention the reprogramming in his speech but addressed it in an interview Tuesday as he flew from Washington to California. Money was reprogrammed, but the service is part of the “DHS team” and has received strong support from Homeland Security Secretary Kirstjen Nielsen, he said.

“Clearly, I don’t think there is a better investment of a federal taxpayer dollar than the Coast Guard,” he said. “We’re a modestly funded organization, and we do good things with those dollars. But that said, if the department needs to reprogram money, that is a part of doing business.”

The admiral also downplayed the effects of the shutdown, which internal Coast Guard documents first reported on by the New York Times said caused “tremendous backlogs of contractor work” in the service and a “domino effect” in delayed maintenance that could leave the service short of working aircraft.

Schultz indicated that the problems were not as severe as characterized. The service should be “pretty much reconstituted” by May, he said, and is about 75 percent of the way back now.

“You know, there are facts and there are emotions around facts,” Schultz said. “As we got our arms around it, I would tell you we will be full-up ready as we would have been without the shutdown as a Coast Guard by the summer, if not sooner.”

Schultz expressed gratitude for new vessels the service is fielding, including the 418-foot national security cutter that is now considered the centerpiece of the Coast Guard fleet and new 360-foot offshore patrol cutters that will replace vessels that are in some cases 50 years old.

Each national security cutter will be equipped with a ScanEagle unmanned aircraft, something the service already is experimenting with in cocaine interdiction operations in the eastern Pacific, Schultz said.

Schultz also underscored the Coast Guard’s desire for new polar icebreakers. The nation’s one working heavy icebreaker, the USCGC Polar Star, recently returned to its home port in Seattle after a voyage to Antarctica that included crew members fighting a fire in an incinerator for 90 minutes and repairing a broken shaft that had allowed water into the ship. It is 43 years old and due for $15 million in upgrades to make it last an additional five years.

In an interview, Schultz said that spending money to renovate the aged ship “is not the best investment of dollars” but is necessary considering no replacement is ready. He has advocated a strategy in which six new icebreakers, including three heavy ones, are purchased but acknowledged there may be some wiggle room as long as the Coast Guard receives the three heavy ones.

“There are other folks who say: ‘Hey, maybe you don’t need six of these. Maybe it’s four of these or five of these.’ There might be some negotiating space,” he said. “It’s the three polar security cutters that is really first and foremost in my mind.”

Schultz also highlighted efforts to make the service more welcoming to women and minorities, citing a study on retention that the service will unveil next week in which hundreds of female service members were interviewed. A similar study of minorities will begin soon.

The plans for change include revising weight standards that disproportionately affect women, easing the service's tattoo policy and removing some restrictions that were in place on single parents, the admiral said.

“These actions are the first steps in a dedicated campaign to identify barriers to inclusion, and to help frame solutions that challenge the status quo,” he said. “They are small ripples that will lead to a groundswell of cultural change.”


Soldier’s posthumous Medal of Honor highlights the Pentagon’s struggles to fully recognize valor in combat

Army Staff Sgt. Travis Atkins was recognized at the White House on Wednesday.


President Trump presents the Medal of Honor to the family of Army Staff Sgt. Travis Atkins, who saved the lives of three other soldiers in the Iraq War. (Jabin Botsford/The Washington Post)

Army Sgt. Sand Aijo was in the gun turret of a Humvee in 2007 when he and his fellow soldiers rolled up on two suspicious men in Iraq’s “Triangle of Death.” They were in a place U.S. soldiers didn’t expect to find them, and so glassy-eyed and fidgety that Aijo charged his machine gun, he recalled.

Staff Sgt. Travis Atkins, their gruff but revered squad leader, stepped out of the Humvee and walked toward the first stranger. Then an Army medic stepped out of the back seat, moving toward the second.

As Aijo tried to keep track of both soldiers, Atkins unexpectedly began grappling with the first Iraqi just a few feet away. Atkins grabbed him in a bear hug, slammed him to the ground and pinned him down.

“The thing that became confusing was that once they hit the ground, the way that Travis began positioning his body, it just seemed strange to me,” Aijo said. “That’s when the detonation happened.”

On Wednesday, Atkins, of Bozeman, Mont., posthumously became the fifth U.S. service member to receive the nation’s highest award for combat valor, the Medal of Honor, for actions during the Iraq War.

Atkins’s son, Trevor Oliver, accepted the award on behalf of his late father from President Trump, who highlighted how Atkins, then 31, died June 1, 2007, saving the lives of the three other soldiers by choosing to smother a suicide vest with his own body.

“In his final moments on earth, Travis did not run. He didn’t know what it was to run,” Trump said. “He laid down his life to save the lives of his fellow warriors.”

The case highlights the Pentagon’s longtime struggles to fully recognize some of the U.S. military’s most highly regarded modern-day heroes — and underscores the likelihood that the Pentagon will soon belatedly award other service members the nation’s highest combat decoration.

To date, no living service member or veteran has received the Medal of Honor for actions in Iraq. Seventeen Americans have been awarded Medals of Honor for actions in Afghanistan, including four posthumous awards.

Doug Sterner, an Army veteran and historian who has testified before Congress on valor issues, said Wednesday that he is aware of at least one case in which a living Army veteran will soon be awarded the Medal of Honor for actions in Iraq. Sterner said he could not disclose whom, and Army officials declined to comment.


Staff Sgt. Travis Atkins, second from right. (Photo released by the U.S. Army)

Atkins’s award is the latest to surface since Defense Secretary Ash Carter launched a review in 2016, after years of U.S. troops and some members of Congress voicing frustration over how few recipients came from modern conflicts.

The Pentagon set out to review more than 1,300 cases in which U.S. troops had received the nation’s second- and third-highest valor awards to determine whether the recipients were worthy of a more prestigious medal.

In Atkins’s case, his battalion commander in the 10th Mountain Division, now-retired Army Col. John Valledor, nominated him for the Medal of Honor. The Army downgraded the award to the Distinguished Service Cross, the service’s second-highest award, and presented it to his family in 2008.

Valledor said Tuesday he was “pretty satisfied” when Atkins received the Distinguished Service Cross. But he acknowledged being surprised the higher award was not approved. He nominated Atkins for the Medal of Honor after researching earlier cases in which recipients had smothered grenades, he said, and concluded that the only difference was that in Atkins’s case, “it was a living grenade.”

“I had a lengthy discussion with my chain of command, and I think the consensus was that we were too close to it,” he said. “That we were too emotionally tied to the narrative.”

Similar stories linger.

In August, Trump posthumously awarded Air Force Tech Sgt. John Chapman the Medal of Honor for his actions in March 2002 on a snowy Afghan mountaintop. Chapman, 36, received the Air Force Cross, his service’s second-highest award, in 2003 for fighting to his death and fending off the ambush of a helicopter filled with Army Rangers, but the Pentagon determined he deserved the higher decoration.

Last May, Trump also awarded Navy Command Master Chief Britt Slabinski, 49, the Medal of Honor for valor in the same battle in which Chapman was killed. The Navy SEAL had received the Navy Cross, but the medal was upgraded after the Pentagon’s review.

Potentially unresolved cases include that of Army Sgt. 1st Class Alwyn Cashe, 35. He posthumously received the Silver Star after pulling six wounded soldiers from a burning Bradley Fighting Vehicle in a fuel-soaked uniform in Iraq on Oct. 17, 2005, suffering burns over more than 70 percent of his body.

His battalion commander at the time, now-Maj. Gen. Gary Brito, told the Los Angeles Times in 2014 that he wishes he had submitted Cashe for the Medal of Honor.

“If Cashe doesn’t get a Medal of Honor, I’m just going to be totally disappointed,” Sterner said. “It’s the most striking example of a Medal of Honor case that I have ever encountered.”

The dearth of modern Medals of Honor has been attributed to the inexperience U.S. commanders had with recommending and processing the award early in the Iraq and Afghanistan wars. The United States had not been in a major conflict in years, and few Vietnam veterans remained in the ranks.

Dwight Mears, a retired Army officer and historian who published a book about the Medal of Honor, said that there was “a cultural problem with the military not knowing what the appropriate gallantry thresholds were."

“I think it is largely resolved at this point, but there was some naivete early in those conflicts,” he said.

U.S. military officials said Wednesday that the Pentagon also has approved recent upgrades for 12 soldiers to receive Distinguished Service Crosses, three Marines and 12 sailors to receive Navy Crosses, and five airmen to receive Air Force Crosses. The medals were upgraded from the Silver Star, the third-highest valor award.

The Marine Corps also upgraded nine additional awards to Silver Star, and the Navy upgraded 18. The Air Force upgraded four additional awards to Silver Star and two to Distinguished Flying Cross with V device.

Members of the Atkins family told reporters Tuesday that they had been appreciative of the Distinguished Service Cross and had not expected Atkins’s award to be elevated when the White House reached out to them.

In fact, Oliver and Atkins’s father, Jack, said with a chuckle that they initially thought the calls from Washington were part of a scam. In reality, it was administration staff members trying to connect them with Trump.

“I thought there was some elaborate plan going on and they were just trying to fool me. I immediately was not very nice to people on the phone, and I was being rather rude,” said Oliver, who was 11 when his father died. “My girlfriend was in the room, and she said my jaw was on the floor and I was beet red. It was a liberating experience. It’s such an incredible, incredible honor.”

Aijo said he was “speechless” when he found out about the upgrade for his former mentor.

“You don’t think about things like this that often, so it brought back a lot of emotion for me,” he said. “Once I had time to kind of settle and bring back my thoughts, I was extremely overjoyed. It was nice to know that a grateful nation would be equally thankful for this sacrifice as I was.”

This story was originally published Wednesday morning and updated after the ceremony.


Here’s why NSA officials never seem to stop talking about 9/11


Gen. Keith Alexander, left, director of the National Security Agency, testifies at a House Intelligence Committee hearing on Capitol Hill in Washington on Oct. 29, 2013. Top U.S. intelligence officials appeared at the hearing amid a public uproar that had expanded from anger over the NSA collecting the phone and email records of Americans to spying on European allies. Director of National Intelligence James Clapper looks on at right. (Jason Reed/Reuters)

NSA talking points prepared to respond to the wave of leaks about surveillance practices advised officials to cite 9/11 to justify programs, according to a document obtained by Al Jazeera America via a Freedom of Information request.

“I much prefer to be here today explaining these programs, than explaining another 9/11 event that we were not able to prevent,” was among the suggested responses, as was "NSA and its partners must make sure we connect the dots so that the nation is never attacked again like it was on 9/11."

And it appears officials have taken that advice to heart: Sept. 11 or 9/11 was mentioned 14 times during a House Intelligence Committee hearing about the leaks Tuesday -- five of them from NSA Director Gen. Keith B. Alexander. In one of his early mentions he gave the specific death count for the terrorist attack when explaining the origin of the programs: "How did we end up here? 9/11 -- 2,996 people were killed in 9/11."

Both representatives of the intelligence community and congressional advocates for surveillance programs invoked 9/11 to argue that such a tragedy might not have happened if the programs revealed by former NSA contractor Edward Snowden had been in place before the attack. “Prior to 9/11, we had no way of connecting those dots,” Alexander argued. But now, he said, the intelligence agency has “programs to do that.” Rep. Charles “Dutch” Ruppersberger (D-Md.) used similar language, claiming that “these dots should have and likely could have been connected to prevent 9/11, and are necessary to prevent the next attack.”

It's been 12 years since the attacks of 2001, but the NSA apparently still regards that fateful event as the strongest argument for expanded spying authority.


FBI surveillance devices may interfere with 911 calls, U.S. senator says


Sen. Ron Wyden (D-Ore.) asked the Justice Department to be more forthcoming about the potentially disruptive nature of cell tower simulators -- also known as IMSI Catchers or Stingrays (Al Drago/Bloomberg)

Cellphone tracking devices commonly used by the FBI and other federal and local law-enforcement agencies have the potential to disrupt emergency 911 communications, a U.S. senator said this week, raising new concerns about whether the devices are a threat to individuals' personal safety.

In a Tuesday letter addressed to Attorney General Jeff Sessions, Sen. Ron Wyden (D-Ore.) asked the Department of Justice to be more forthcoming about the potentially disruptive nature of cell tower simulators -- also known as IMSI Catchers or Stingrays -- which law enforcement agencies and others use to covertly track suspects’ movements through their cellphones.

Citing conversations with unnamed executives from Harris Corporation, a Florida-based government contractor that makes a widely used cell tower simulator, Wyden wrote that the devices “completely disrupt the communications of targeted phones for as long as the surveillance is ongoing.”

“According to Harris, targeted phones cannot make or receive calls, send or receive text messages, or send or receive data over the Internet,” Wyden wrote.

Jim Burke, Harris Corporation’s director of global public relations, declined to answer questions about whether the company’s devices can interfere with U.S. citizens’ emergency call services. A Justice Department spokesman said: “We’re aware of the senator’s letter and will be reviewing it.”

Jonathan Mayer, a Princeton computer science professor who served as a technologist at the Federal Communications Commission, said the issue deserves close scrutiny.

“Anything that significantly interferes with 911 is a problem," Mayer said. "And anything that could significantly interfere with 911 deserves a close look.”

The devices work by effectively posing as a cellphone tower and tricking targeted cellphones into connecting with them, giving the user a sense of where the targeted phone is located. They can also be used to eavesdrop or plant malware.

[Here’s how a StingRay works]

Cell tower simulators have worried privacy advocates for years, and their availability online has sparked fears that foreign governments could be using them to conduct espionage in the United States.

The FBI says it primarily uses them to track suspects’ movements in high-stakes cases like drug trafficking and child kidnapping, and does not collect data from the phones themselves. The U.S. government does not have a monopoly on them, however: The devices can be found for sale on Chinese e-commerce sites. And technologists have built their own devices from scratch.

In June the Department of Homeland Security found them deployed near the White House. One mobile security firm claims to have detected them outside major government buildings and embassies across Washington.

The devices' wide-ranging use – and a web of nondisclosure agreements that limit the public’s view of how they work – has been a continuous point of contention for online privacy advocates.

In 2013, a trove of documents released through a Freedom of Information Act lawsuit revealed that federal investigators failed to fully detail the practice to judges authorizing warrants. Wyden’s letter revived those concerns, claiming that the Justice Department’s warrant applications understate how much disruption the devices cause to cell service.

The Justice Department “knows far more about cell-site simulators than the courts,” Wyden’s letter reads. “It has an obligation to be candid, forthright and to fully disclose to courts the true impact of this surveillance technology.”

[DHS has detected possible cellphone surveillance in D.C. and doesn’t know who’s doing it]

Stingray devices made by Harris Corporation are supposed to include a feature that relinquishes control when a targeted phone dials emergency services, but the feature has not been independently tested as part of the company’s Federal Communications Commission certification process, according to unnamed Harris Corporation executives cited in Wyden’s letter. The letter similarly asserts that the devices could block emergency services for people who are deaf and hard of hearing by interfering with text-based emergency services.

In a 2017 FBI search warrant request to track the phone of a drug trafficking suspect, for example, an FBI agent wrote that the devices “may interrupt cellular services” of nearby phones but described that disruption as “brief and temporary.” The application does not discuss whether such disruption could interfere with 911 calls, something legal experts said should factor into courts’ decisions on whether a search warrant is “reasonable,” as spelled out in the Fourth Amendment.

“If what [Wyden’s] letter is saying is true about the extent of disruption that Stingrays cause, this is very concerning,” said Stephen Smith, a Stanford Law School professor who served as a magistrate judge from 2004 until 2018. “The ultimate test is whether the warrant is reasonable or not, and it seems that is a very important bit of information to have.”

[Secrecy around police surveillance equipment proves a case’s undoing]

For Harris Corporation, the government’s surveillance efforts are an important source of business. The company has sold Stingray devices and related equipment to the FBI, the Drug Enforcement Administration and Immigration and Customs Enforcement, among other federal agencies, in sales that amounted to tens of millions of dollars over the past decade. A congressional committee staff report found the devices themselves cost $41,500 to $500,000 each.

A 2016 report from the House Oversight Committee said the Justice Department at the time had 310 cell tower simulators and spent more than $71 million on them between 2010 and 2014. Local law-enforcement agencies have used them as well, including metropolitan police in D.C., Alexandria and Baltimore, the report states.

Harris Corporation and the Justice Department maintain strict nondisclosure agreements preventing buyers from publicly discussing the technology, according to documents posted online by the ACLU. Cooper Quintin, a technologist at the advocacy group Electronic Frontier Foundation, said courts have been “kept in the dark” about Stingray technology.

“Harris Corporation might claim that they’re not in fact blocking 911 calls,” Quintin said. “But it’s unknowable because thanks to Harris Corporation’s nondisclosure agreements and their corporate policy of silence, we have very little information about how [Stingrays] work and what implications they have.”


Terms of Sale for Print Products

Updated January 9, 2017

This Terms of Sale governs the sale of Washington Post Print Products (the “Print Products”).

1. Print Products

The Washington Post Print Products include home delivery of the printed version of The Washington Post newspaper and may include access to some or all of Washington Post Digital Products, such as the website (www.washingtonpost.com), mobile site, and tablet and mobile apps. By using any of the Digital Products, you agree to our Terms of Service and Privacy Policy.

You can view The Post’s various subscription offerings at https://account.washingtonpost.com/acquisition. We also offer gift subscriptions at https://subscribe.washingtonpost.com/gift.

The Post reserves the right to modify the content, type and availability of any Print Products at any time.

2. Subscription

a. Auto-renewing Subscription. Your Print Product subscription, which may start with a promotional rate, will auto-renew at the end of the cycle stated at the time of your order (“Billing Period”) unless and until you cancel your subscription or we terminate it. You can view the date of your next scheduled payment by logging in to your account on our website and clicking on the “My Subscriptions” tab. You will not receive a notice from us that your promotional period has ended or that your subscription has auto-renewed.

b. Differing Subscriptions/Promotions. The Post may offer a number of types of subscriptions, including subscriptions to different Washington Post products and special promotions. Any materially different terms from those described in these Terms of Sale will be disclosed at the time of purchase or in other communications made available to you. You can find specific details regarding your subscription by logging in to your account on our website and clicking on the “My Subscriptions” tab. We reserve the right to change or terminate any offered subscriptions or promotions at any time. All Print Product subscriptions include delivery of our special Thanksgiving Day edition and other special editions as may be designated by The Post (up to 12 per calendar year). You will be charged your then-current Sunday home delivery rate for these editions. Unless otherwise stated, your subscription does not include TV Week. New print subscriptions may be charged a one-time activation fee.

c. Eligibility. Print Product offers are valid only in limited locations within The Washington Post’s 7-day home-delivery area for new subscribers and those who have not been a Washington Post subscriber for the past thirty (30) days. Offers cannot be combined with any other Washington Post reduced-price home-delivery sales offers.

3. Billing

a. Payment Method. You can pay for your subscription with a major credit card (“Payment Method”). Only credit cards are eligible for payment. Do not sign up for a subscription by identifying a debit card in the credit card option. A debit card may also be known as a “check” or “ATM” card and typically has the word “debit” on it. You may edit your Payment Method information by logging in to your account on our website and clicking on the “My Subscriptions” tab. If your payment is unsuccessful by reason of insufficient funds, expiration, or otherwise, you remain responsible for any uncollected amount.

b. Recurring Billing. By placing an order for a subscription, you authorize us to charge you the subscription fee then in effect at the beginning of each Billing Period to your Payment Method. For example, you authorize us to charge your Payment Method the promotional rate disclosed on the subscription screen in the initial Billing Period (if applicable) and the regular subscription rate in subsequent Billing Periods. We automatically bill your Payment Method on the last day of each Billing Period. We reserve the right to change the timing of our billing, in particular, in the event your Payment Method has not successfully settled. If your Payment Method is declined for a recurring payment of your subscription fee, you have ninety (90) days to provide us a new Payment Method or your subscription will be canceled.

You acknowledge that the amount charged each Billing Period may vary for reasons that may include price changes or changing your subscription, and you authorize us to charge your Payment Method for such varying amount each Billing Period. You will not be notified of upcoming charges, except as otherwise provided herein.

c. Price Changes. We reserve the right to change subscription fees for any of our subscriptions at any time. We will notify you of any changes if the regular fee for your subscription changes from what was stated at the time of your initial order. You will have an opportunity to cancel or change your subscription at that time. If you do not cancel or change your subscription, you will be charged the new subscription fee at your next Billing Period.

d. Billing Period. We will charge the subscription fee at the commencement of your subscription or, if applicable, at the end of your free trial period, and automatically on the first calendar day of each Billing Period thereafter unless and until your subscription is cancelled.

e. One-Time Purchases. When you purchase a stand-alone product, such as a gift subscription, we will charge your Payment Method at the time of purchase.

4. Cancellations and Refunds

a. Cancellations. For Print Products, you can cancel your subscription by calling Customer Care at 202-334-6100 and speaking with a representative.

b. Refunds. Payments are non-refundable, and there are no refunds or credits for partially used Billing Periods. We reserve the right, however, to issue refunds or credits at our sole discretion. If we issue a refund or credit in one instance, we are under no obligation to issue the same refund or credit in the future.

5. E-Sign Disclosure and Consent. By purchasing a Print Product subscription and/or clicking on the box at account opening, you consent to receive notices, disclosures, agreements, policies, receipts, confirmations, transaction information, account information, other communications, and changes or updates to any such documents electronically (collectively, the “Electronic Communications”). We will provide these Electronic Communications by posting them on the profile page for your account on the Washington Post website and/or emailing them to your primary email address associated with your Print Product subscription. You agree that the Electronic Communications will satisfy any legal communication requirements, including that such communications be in writing. Electronic Communications will be deemed received by you within 24 hours of the time posted to our website or on the profile page for your account, or within 24 hours of the time emailed to you unless we receive notice that the email was not delivered.

a. System Requirements to Access Information. To receive Electronic Communications, you must have the following equipment and software:

• a computer or other device with an Internet connection;

• a current web browser that includes 128-bit encryption (e.g. Internet Explorer version 6.0 and above, Firefox version 2.0 and above, Chrome version 3.0 and above, or Safari 3.0 and above) with cookies enabled;

• Adobe Acrobat Reader version 8.0 and above to open documents in .pdf format;

• a valid email address (your primary email address associated with the Print Product Subscription); and

• sufficient storage space or other methods (e.g., a USB drive or secure online storage) to save past Electronic Communications or a printer to print them.

Your access to this page verifies that your system/device meets these requirements. You also confirm that you have access to the necessary equipment and are able to receive, open, print, or store Electronic Communications.

It is your responsibility to keep your primary email address up to date. You can change your primary email address by logging in to your account on our website and accessing your profile. You agree that Electronic Communications sent to a primary email address that is incorrect, out of date, blocked by your service provider, or cannot be received due to your failure to maintain the system requirements, will be deemed to have been provided to you. If an Electronic Communication is returned to us because your email address becomes invalid, we may deem your subscription to be inactive, and you will not receive or have access to your subscription until we receive a valid, working primary email address from you.

We will notify you if there are any material changes to the hardware or software needed to receive Electronic Communications.

b. Paper Delivery of Disclosures and Notices. You have the right to receive a paper copy of the Electronic Communications. To receive a paper copy at no charge, please request it in one of the following ways: (1) go to the Washington Post Help Desk www.washingtonpost.com/contactus and send us a message with your name and email address; or (2) call us at 202-334-6100 and speak to the customer service representative. Any withdrawal of your consent to receive Electronic Communications will be effective only after we have a reasonable period of time to process your withdrawal. You understand and agree that if you withdraw your consent we may – though we are not required to – cancel your Print Product subscription.

6. Changes to the Terms of Sale. We may, from time to time, change these Terms of Sale. When such changes are made, we will make a copy of the new Terms of Sale available to you. You can contact homedelivery@washpost.com if you have any questions about the Terms of Sale.

Monkey Cage

Trump won’t allow you to use iPads or laptops on certain airlines. Here’s why.

Britain joined the U.S. in creating new restrictions for passengers traveling on flights from airports in several Muslim-majority countries. Here's what you need to know. (Monica Akhtar,Dani Player/The Washington Post)

From Tuesday on, passengers traveling to the U.S. from 10 airports in eight Muslim-majority countries will not be allowed to have iPads, laptops or any communications device larger than a smartphone in the cabin of the plane. If you are traveling from Egypt, Jordan, Kuwait, Morocco, Qatar, Saudi Arabia, Turkey, or the UAE on Egypt Air, Emirates, Etihad Airways, Kuwait Airways, Qatar Airways, Royal Air Maroc, Royal Jordanian Airlines, Saudi Arabian Airlines, or Turkish Airlines, and you want to use your laptop on the flight, you are probably out of luck.

The nations affected by President Trump's executive action on immigration are not actually the home countries of terrorists who have carried out fatal attacks in the United States. (Daron Taylor/The Washington Post)

So why is the United States doing this, and how can it get away with it?

The U.S. says it’s all about security

The Trump administration says the new rules were introduced because of intelligence that shows terrorists are continuing to target airlines flying to the United States. An unidentified person familiar with the issue has told The Washington Post that officials have long been worried by a Syrian terrorist group that is trying to build bombs inside electronic devices that are hard to detect.

However, as Demetri Sevastopulo and Robert Wright at the Financial Times suggest, non-U.S. observers are skeptical of this explanation. They note that the United States has not been forthcoming about whether the ban is based on recent intelligence or long-standing concerns. There is also no explanation for why electronic devices in the cabin are a concern while electronic devices in the baggage hold are not.

There is an alternative explanation

It may not be about security. Three of the airlines that have been targeted for these measures — Emirates, Etihad Airways and Qatar Airways — have long been accused by their U.S. competitors of receiving massive effective subsidies from their governments. These airlines have been quietly worried for months that President Trump was going to retaliate. This may be the retaliation.

These three airlines, as well as the other airlines targeted in the order, are likely to lose a significant amount of business from their most lucrative customers — people who travel in business class and first class. Business travelers are disproportionately likely to want to work on the plane; they are prepared to pay business-class or first-class fares precisely because those cabins allow them to work in comfort. These travelers are unlikely to appreciate having to do all their work on smartphones, or not being able to work at all. The likely result is that many of them will stop flying on Gulf airlines and start traveling on U.S. airlines instead.

As the Financial Times notes, the order doesn’t affect only the airlines’ direct flights to and from the United States — it attacks the “hub” airports that are at the core of their business models. These airlines not only fly passengers directly from the Gulf region to the United States — they also fly passengers from many other destinations, transferring them from one plane to another in the hubs. This “hub and spoke” approach is a standard economic model for long-haul airlines, offering them large savings. However, it also creates big vulnerabilities. If competitors or unfriendly states can undermine or degrade the hub, they can inflict heavy economic damage.

The United States is weaponizing interdependence

As we have argued in the past, and talk about in forthcoming work, this can be understood as a variant form of “weaponized interdependence.” We live in an interdependent world, where global networks span across countries, creating enormous benefits, but also great disparities of power. As networks grow, they tend to concentrate both influence and vulnerability in a few key locations, creating enormous opportunities for states, regulators and nonstate actors who have leverage over those locations.

In this context, the United States is plausibly leveraging its control over access to U.S. airports, which are central “nodes” in the global network of air travel between different destinations. It is using this control to attack the key vulnerabilities of other networked actors, by going after the central nodes in their networks (the hub airports) and potentially severely damaging them.

There may not be much that Gulf airline carriers can do

Gulf airlines have tried to defend themselves against political attacks from U.S. competitors by appealing to free trade principles. The problem is that standard free trade agreements, such as World Trade Organization rules, don’t really apply to airlines (although they do apply to related sectors, such as the manufacture of airplanes). This has allowed the Gulf airlines to enjoy massive subsidies, without having to worry too much about being sued in the WTO. However, it also makes it hard for Gulf states or the states of other affected airlines to take a WTO case against the new U.S. rules, even if these rules turn out to be motivated by protectionism and the desire to retaliate, rather than real underlying security questions.

If this were happening in a different sector, it would make for a pretty interesting case. States preserve carve-outs from international trade rules when they feel that their security is at stake. Would the United States prevail in a case like this, where there is a colorable security justification, but where there is also a very plausible argument that the real motivation doesn’t have much to do with security? Or would the WTO defer to the United States’ proposed justification? It’s very likely that the Trump administration will make more unilateral rules that are justified using the language of national security, but are plausibly motivated by protectionism, so we may find out.

#####EOF##### Samantha Josephson funeral: Lawmakers call for ride share change as family mourns - The Washington Post

A student thought she was getting into an Uber. Lawmakers want to prevent another deadly mistake.


Samantha Josephson, a University of South Carolina student, was reported missing after last being seen early Saturday. Later that day, the university confirmed her death. (Columbia Police Department/AP)

Tuesday night, hundreds of mourners gathered in a New Jersey park on the eve of Samantha Josephson’s funeral to say goodbye.

They lit candles and shared memories of their friend, a college student who was killed in South Carolina last week after, police say, she mistakenly got into a vehicle she believed was her Uber ride. Hours later, hunters found her body in a field, and 24-year-old Nathaniel D. Rowland has been charged in connection with her death.

Her story has gripped the country for days, but the loss was felt most acutely in Robbinsville, N.J., the town where she grew up.

“I am so thankful for the years that we had with each other, and I will miss her each and every day,” a friend, Jess Samuel, told WABC7.

Josephson will be laid to rest Wednesday in Princeton Junction, N.J. Hundreds of family members and friends packed a synagogue in the area for her funeral, according to WABC7.

Josephson’s cousin Seth Josephson spoke to reporters about the pain the family was experiencing.

“The sadness that is being suffered will never end. It may wane in the future, but will always leave a hole in the hearts of [her] fun-loving, generous and kind parents and sister,” he said, according to NJ.com. “Today, they don’t know and can’t contemplate how they will think of the future.”

Her death has increased focus on how to improve the safety of ride-hailing services. One of the loudest voices in the call for change has been Josephson’s father.

“What he did, I don’t want anyone else to go through it as a parent,” Seymour Josephson said Tuesday. “We want something to change.”

On Tuesday, the South Carolina House introduced legislation named in her honor that, if passed, would add new safety requirements for ride-hailing vehicles.

The Samantha L. Josephson Ridesharing Safety Act would require drivers to clearly identify their ride-hailing vehicles by displaying illuminated signs when active. The signs must be visible both day and night, a step up from current South Carolina law, which requires drivers to display a reflective “signage or emblem” on their vehicles. The legislation also would require drivers to return the illuminated sign when they stop working for a ride-hailing app.

“I’m just sick about this,” state Rep. Seth Rose (D), one of the lawmakers who introduced the bill, told the State.

A representative from Uber, one of the most popular ride-hailing services, told a Columbia NBC affiliate that the company was “devastated” by Josephson’s death.

“Since 2017, we’ve been working with local law enforcement and college campuses across the country to educate the public about how to avoid fake rideshare drivers,” the statement said. “Everyone at Uber is devastated to hear about this unspeakable crime, and our hearts are with Samantha Josephson’s family and loved ones. We remain focused on raising public awareness about this incredibly important issue.”

The University of South Carolina, where Josephson was a senior, sent safety tips to students urging them to “be aware of their surroundings” and to “exercise best practices” when using ride-hailing services such as Uber and Lyft.

Early on Saturday, Josephson said goodbye to her friends after a night at the Bird Dog bar in Columbia, S.C. It was just past 2 a.m. when she called an Uber to take her home.

But the car she got into was not her ride, police said. Surveillance video showed her getting into a black Chevrolet Impala, which had pulled up beside her, police said.

Her friends reported her missing 12 hours later, and not long after that, two hunters found her body in a field about 70 miles from the bar where she was picked up. Autopsy results released Monday said Josephson had died of “multiple sharp force injuries.”

About 24 hours after she disappeared, police spotted a car matching the description of the one Josephson had gotten into. An officer pulled the driver over and asked him to step out of the vehicle, but the man fled. He was later apprehended.

When they inspected the car, authorities discovered antibacterial wipes, bleach, window cleaner, Josephson’s phone and blood that tests later revealed to be hers. The driver, Rowland, was arrested and charged with murder and kidnapping. He will appear in court later this month.

On Tuesday, friends and family gathered in Robbinsville chose to use the vigil to celebrate her life. They shared anecdotes and “Sammyisms,” which told of a girl who loved to sing, laugh and spend time with her friends.

“She would want us to toast her, not cry for her,” said neighbor Barb Samel, according to the Asbury Park Press. “If my family and I learned anything from Sammy it was how to laugh often, love much and to be yourself no matter who was watching.”

Read more:

Four people were found dead in a warehouse this week. Their story is almost a complete mystery.

An unsolved homicide haunted a city for decades. Officials say the killer was there all along.

#####EOF##### Dan Lamothe - The Washington Post

Dan Lamothe

Washington, D.C.

Reporter covering the Pentagon and the U.S. military
Education: University of Massachusetts at Amherst, BA in journalism; University of Maryland at College Park, master's in journalism
Dan Lamothe covers national security for The Washington Post, with an emphasis on the Pentagon and the U.S. military. He joined The Post in 2014 and has traveled extensively since then on assignment. Lamothe has embedded with U.S. troops in combat in Afghanistan multiple times, and he has also reported from the Aleutian Islands, Iraq, the United Arab Emirates, Oman, the Arctic Circle, Norway, Belgium, Germany, France, Singapore, Australia, Mexico, Spain and the Republic of Georgia.
Honors & Awards:
  • 2018 Colonel Robert D. Heinl Jr. Award, presented by the Marine Corps Heritage Foundation for distinguished feature writing and accompanying podcast about a family of brothers in World War II
  • 2011 Major Megan McClung Award, presented by the Marine Corps Heritage Foundation for distinguished reporting alongside infantry Marines in Afghanistan
  • 2011 Military Reporters and Editors honorable mention, presented for dispatches from the battlefield in Afghanistan
Professional Affiliations: Pentagon Press Association, Military Reporters and Editors Association
Latest from Dan Lamothe

Army Gen. Stephen Townsend said that he has a “pretty significant concern” for how Russia uses private military companies like the Wagner Group in Africa.

  • Apr 2, 2019

The Army infantryman is credited with saving the lives of three fellow soldiers.

  • Mar 27, 2019

The move, announced Monday, was justified under Trump’s declaration of a national emergency at the border.

  • Mar 26, 2019

Adm. Karl Schultz delivered a ‘State of the Coast Guard’ address in Los Angeles.

  • Mar 21, 2019

“Once the rhetoric started with the wall, it definitely picked up in the maritime,” said a senior U.S. law enforcement official.

  • Mar 21, 2019

When it comes to aircraft, Boeing has long been the government’s favorite.

  • Mar 18, 2019

Beijing’s recent behavior in the Arctic has triggered some alarms in the Pentagon.

  • Mar 15, 2019

Officials have touted the program as a way to speed up vetting of recruits who have what the Pentagon considers “foreign nexus” risks.

  • Mar 12, 2019

The president’s “cost plus 50” formula has struck fear in the hearts of countries that host American troops.

  • Mar 9, 2019

Send in your questions for a live discussion at 12:30 P.M. ET Tuesday.

  • Nov 27, 2018
#####EOF##### ‘Internet of Things’ compounded Friday’s hack of major websites - The Washington Post

‘Internet of Things’ compounded Friday’s hack of major websites

What happens when you send an e-mail or buy something online? Most of what we do on the Internet requires sending data thousands of miles to other computers. But how does the data know where to go, and can it get lost or stolen along the way? (Julio C. Negron, Craig Timberg and Jorge Ribas/The Washington Post)

A key part of the Internet's infrastructure was hit by a series of attacks Friday, causing major services such as Twitter, Spotify and PayPal to be inaccessible for many users around the world.

The attacks targeted Dyn, a company that helps people connect to websites, with a huge amount of traffic in an attempt to knock the service offline, the firm said. The incident showed how a digital assault on just one company can disrupt a huge chunk of the Internet.

Dyn chief strategy officer Kyle York said some of the traffic that attacked the company came from compromised "Internet of Things" devices, everyday items like baby monitors, webcams and even thermostats that can connect to the Internet.

The first cyberattack occurred around 7 a.m. Eastern, and primarily affected users on the East Coast, according to Dyn. It was resolved at roughly 9:20 a.m., the company said. Then a second attack began around 11:50 a.m., and a third attack in the afternoon, according to the company. The later attacks spread further, disrupting access to major sites for users in many different parts of the world, York said.

The attacks disrupted so many sites because New Hampshire-based Dyn is one of a handful of major Domain Name System, or DNS, service providers. DNS works sort of like a phone book for the Internet — translating domain names into the numerical IP addresses of the servers that actually host sites, so your browser can connect to them.
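That phone-book lookup can be seen directly from any machine. Here is a minimal sketch using Python's standard-library resolver (the function name is our own; "localhost" is used only because it resolves without network access):

```python
import socket

def resolve(name: str) -> str:
    """Look up the IPv4 address for a hostname via the system's DNS resolver.

    This is the step a DNS provider like Dyn performs at scale: turning a
    human-readable name into the numerical address a browser connects to.
    """
    return socket.gethostbyname(name)

# "localhost" resolves locally, typically to 127.0.0.1.
print(resolve("localhost"))
```

When a provider handling this translation for many sites is knocked offline, browsers can no longer find those sites' addresses, which is why a single attack disrupted such a large chunk of the Internet.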

On a call with reporters Friday afternoon, the company said it was still responding to the attacks.

"This is hitting our network from tens of millions of IP addresses around the world," York said. The third wave of attacks was resolved around 6 p.m., according to Dyn.

It remains unclear who was behind the attacks.

Issues with Amazon Web Services, a cloud hosting provider relied on by many popular sites, also occurred Friday morning. A status update posted on its website noted disruptions at roughly the same time as the first attack against Dyn.

“The root cause was an availability event that occurred with one of our third party DNS service providers,” the company said, although it did not specifically cite Dyn. (Amazon chief executive Jeff Bezos owns The Washington Post.)

The Department of Homeland Security said it is looking into the issue. "We're aware and are investigating all potential causes," DHS deputy press secretary Gillian Christensen said in an e-mailed statement.

The type of attacks targeting Dyn are commonly known as distributed denial of service, or DDoS attacks.

Last week a DHS cyber defense team warned that new strains of malware are using Internet of Things devices to carry out these attacks. In particular, the group warned about the source code for a variant called "Mirai" being released online.

One of the first major instances of Internet of Things devices being used this way was a record-breaking attack on journalist Brian Krebs's website last month, as Krebs himself reported.

Dyn helped Krebs investigate the attack and recently presented research on the case.

Experts have long warned that many Internet of Things devices are poorly secured — often due to the speed at which they are brought to market.

"It's important for [Internet of Things] vendors who haven't prioritized security to take this escalating series of attacks as a wake-up call," said Casey Ellis, the founder of crowd-sourcing cybersecurity firm Bugcrowd. "We're entering a period where this is very real, calculable, and painful impact to having insecure products."

DDoS attacks, in general, have become more powerful and more frequent.

A recent report from cloud security provider Akamai said it saw a 129 percent increase in DDoS attacks against its customers in the second quarter of 2016 versus the same period a year earlier.

That combination of growing power and frequency makes DDoS attacks hard for major sites to withstand, even for services like Dyn that have regularly fended them off in the past.

"It's a real challenge," said York.

#####EOF##### NSA tracking cellphone locations worldwide, Snowden documents show - The Washington Post

NSA tracking cellphone locations worldwide, Snowden documents show

Video: The National Security Agency gathers location data from around the world by tapping into the cables that connect mobile networks globally and that serve U.S. cellphones as well as foreign ones.

The National Security Agency is gathering nearly 5 billion records a day on the whereabouts of cellphones around the world, according to top-secret documents and interviews with U.S. intelligence officials, enabling the agency to track the movements of individuals — and map their relationships — in ways that would have been previously unimaginable.

The records feed a vast database that stores information about the locations of at least hundreds of millions of devices, according to the officials and the documents, which were provided by former NSA contractor Edward Snowden. New projects created to analyze that data have provided the intelligence community with what amounts to a mass surveillance tool.

(Video: How the NSA uses cellphone tracking to find and ‘develop’ targets)

The NSA does not target Americans’ location data by design, but the agency acquires a substantial amount of information on the whereabouts of domestic cellphones “incidentally,” a legal term that connotes a foreseeable but not deliberate result.

One senior collection manager, speaking on the condition of anonymity but with permission from the NSA, said “we are getting vast volumes” of location data from around the world by tapping into the cables that connect mobile networks globally and that serve U.S. cellphones as well as foreign ones. Additionally, data are often collected from the tens of millions of Americans who travel abroad with their cellphones every year.

A look at how the NSA collects cell phone data and uses it to track individual suspects.

In scale, scope and potential impact on privacy, the efforts to collect and analyze location data may be unsurpassed among the NSA surveillance programs that have been disclosed since June. Analysts can find cellphones anywhere in the world, retrace their movements and expose hidden relationships among the people using them.

(Graphic: How the NSA is tracking people right now)

U.S. officials said the programs that collect and analyze location data are lawful and intended strictly to develop intelligence about foreign targets.

Robert Litt, general counsel for the Office of the Director of National Intelligence, which oversees the NSA, said “there is no element of the intelligence community that under any authority is intentionally collecting bulk cellphone location information about cellphones in the United States.”

The NSA has no reason to suspect that the movements of the overwhelming majority of cellphone users would be relevant to national security. Rather, it collects locations in bulk because its most powerful analytic tools — known collectively as CO-TRAVELER — allow it to look for unknown associates of known intelligence targets by tracking people whose movements intersect.

Still, location data, especially when aggregated over time, are widely regarded among privacy advocates as uniquely sensitive. Sophisticated mathematical techniques enable NSA analysts to map cellphone owners’ relationships by correlating their patterns of movement over time with thousands or millions of other phone users who cross their paths. Cellphones broadcast their locations even when they are not being used to place a call or send a text message.

(Video: Reporter Ashkan Soltani explains NSA collection of cellphone data)

CO-TRAVELER and related tools require the methodical collection and storage of location data on what amounts to a planetary scale. The government is tracking people from afar into confidential business meetings or personal visits to medical facilities, hotel rooms, private homes and other traditionally protected spaces.

“One of the key components of location data, and why it’s so sensitive, is that the laws of physics don’t let you keep it private,” said Chris Soghoian, principal technologist at the American Civil Liberties Union. People who value their privacy can encrypt their e-mails and disguise their online identities, but “the only way to hide your location is to disconnect from our modern communication system and live in a cave.”

The NSA cannot know in advance which tiny fraction of 1 percent of the records it may need, so it collects and keeps as many as it can — 27 terabytes, by one account, or more than double the text content of the Library of Congress’s print collection.

The location programs have brought in such volumes of information, according to a May 2012 internal NSA briefing, that they are “outpacing our ability to ingest, process and store” data. In the ensuing year and a half, the NSA has been transitioning to a processing system that provided it with greater capacity.

The possibility that the intelligence community has been collecting location data, particularly of Americans, has long concerned privacy advocates and some lawmakers. Three Democratic senators — Ron Wyden (Ore.), Mark Udall (Colo.) and Barbara A. Mikulski (Md.) — have introduced an amendment to the 2014 defense spending bill that would require U.S. intelligence agencies to say whether they have ever collected or made plans to collect location data for “a large number of United States persons with no known connection to suspicious activity.”

NSA Director Keith B. Alexander disclosed in Senate testimony in October that the NSA had run a pilot project in 2010 and 2011 to collect “samples” of U.S. cellphone location data. The data collected were never available for intelligence analysis purposes, and the project was discontinued because it had no “operational value,” he said.

Alexander allowed that a broader collection of such data “may be something that is a future requirement for the country, but it is not right now.”

The number of Americans whose locations are tracked as part of the NSA’s collection of data overseas is impossible to determine from the Snowden documents alone, and senior intelligence officials declined to offer an estimate.

“It’s awkward for us to try to provide any specific numbers,” one intelligence official said in a telephone interview. An NSA spokeswoman who took part in the call cut in to say the agency has no way to calculate such a figure.

An intelligence lawyer, speaking with his agency’s permission, said location data are obtained by methods “tuned to be looking outside the United States,” a formulation he repeated three times. When U.S. cellphone data are collected, he said, the data are not covered by the Fourth Amendment, which protects Americans against unreasonable searches and seizures.

According to top-secret briefing slides, the NSA pulls in location data around the world from 10 major “sigads,” or signals intelligence activity designators.

A sigad known as STORMBREW, for example, relies on two unnamed corporate partners described only as ARTIFICE and WOLFPOINT. According to an NSA site inventory, the companies administer the NSA’s “physical systems,” or interception equipment, and “NSA asks nicely for tasking/updates.”

STORMBREW collects data from 27 telephone links known as OPC/DPC pairs, which refer to originating and destination points and which typically transfer traffic from one provider’s internal network to another’s. Those data include cell tower identifiers, which can be used to determine a phone’s location.

The agency’s access to carriers’ networks appears to be vast.

“Many shared databases, such as those used for roaming, are available in their complete form to any carrier who requires access to any part of it,” said Matt Blaze, an associate professor of computer and information science at the University of Pennsylvania. “This ‘flat’ trust model means that a surprisingly large number of entities have access to data about customers that they never actually do business with, and an intelligence agency — hostile or friendly — can get ‘one-stop shopping’ to an expansive range of subscriber data just by compromising a few carriers.”

Some documents in the Snowden archive suggest that acquisition of U.S. location data is routine enough to be cited as an example in training materials. In an October 2012 white paper on analytic techniques, for example, the NSA’s counterterrorism analysis unit describes the challenges of tracking customers who use two different mobile networks, saying it would be hard to correlate a user on the T-Mobile network with one on Verizon. Asked about that, a U.S. intelligence official said the example was poorly chosen and did not represent the program’s foreign focus. There is no evidence that either company cooperates with the NSA, and both declined to comment.

The NSA’s capabilities to track location are staggering, based on the Snowden documents, and indicate that the agency is able to render most efforts at communications security effectively futile.

Like encryption and anonymity tools online, which are used by dissidents, journalists and terrorists alike, security-minded behavior — using disposable cellphones and switching them on only long enough to make brief calls — marks a user for special scrutiny. CO-TRAVELER takes note, for example, when a new telephone connects to a cell tower soon after another nearby device is used for the last time.

Side-by-side security efforts — when nearby devices power off and on together over time — “assist in determining whether co-travelers are associated . . . through behaviorally relevant relationships,” according to the 24-page white paper, which was developed by the NSA in partnership with the National Geospatial-Intelligence Agency, the Australian Signals Directorate and private contractors.

A central feature of each of these tools is that they do not rely on knowing a particular target in advance, or even suspecting one. They operate on the full universe of data in the NSA’s FASCIA repository, which stores trillions of metadata records, of which a large but unknown fraction include locations.

The most basic analytic tools map the date, time, and location of cellphones to look for patterns or significant moments of overlap. Other tools compute speed and trajectory for large numbers of mobile devices, overlaying the electronic data on transportation maps to compute the likely travel time and determine which devices might have intersected.
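The overlap search those tools perform can be illustrated with a toy sketch. This is a hypothetical simplification, not the NSA's actual code: bucket each phone sighting by time window and cell tower, then count how often two phones land in the same bucket. Repeated co-occurrences suggest two devices travel together.

```python
from collections import defaultdict

def co_travel_scores(sightings, window=3600):
    """Count co-occurrences of phone pairs seen at the same tower.

    sightings: list of (phone_id, timestamp_seconds, tower_id) tuples.
    Returns {(phone_a, phone_b): count} for pairs observed at the same
    tower within the same time bucket (default: one-hour windows).
    """
    # Group phones by (time bucket, tower).
    buckets = defaultdict(set)
    for phone, ts, tower in sightings:
        buckets[(ts // window, tower)].add(phone)
    # Every pair sharing a bucket scores one co-occurrence.
    scores = defaultdict(int)
    for phones in buckets.values():
        for a in phones:
            for b in phones:
                if a < b:
                    scores[(a, b)] += 1
    return dict(scores)

# Illustrative data: "target" and "unknown" appear at the same towers
# in the same hour twice; "other" never overlaps with anyone.
demo = [
    ("target", 0, "T1"), ("unknown", 100, "T1"),
    ("target", 4000, "T2"), ("unknown", 4200, "T2"),
    ("other", 0, "T9"),
]
print(co_travel_scores(demo))  # {('target', 'unknown'): 2}
```

A production system would also weigh speed, trajectory and travel time, as the documents describe, but the core idea is this kind of set intersection over time-and-place buckets, run across trillions of records.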

To solve the problem of undetectable surveillance against CIA officers stationed overseas, one contractor designed an analytic model that would carefully record the case officer’s path and look for other mobile devices in steady proximity.

“Results have not been validated by operational analysts,” the report said.

Julie Tate contributed to this report. Soltani is an independent security researcher and consultant.

#####EOF##### Under pressure to digitize everything, hospitals are hackers’ biggest new target - The Washington Post

Under pressure to digitize everything, hospitals are hackers’ biggest new target


A sign designates an entrance to the MedStar Georgetown University Hospital in Washington. (AP Photo/Molly Riley)

The cyberattack on MedStar Health — one of the biggest health-care systems in the Washington region — is a foreboding sign that an industry racing to digitize patient records and services faces a new kind of security threat that it is ill-prepared to handle, security experts and hospital officials say.

For years, hospitals and the health care industry have been focused on keeping patient data from falling into the wrong hands. But the recent attacks at MedStar and other hospitals across the country highlight an even more frightening downside of security breaches: As hospitals have become dependent on electronic systems to coordinate care, communicate critical health data and avoid medication errors, patients’ well-being may also be at stake when hackers strike.

Hospitals are used to chasing the latest medical innovations, but they are rapidly learning that caring for sick people also means protecting their medical records and technology systems against hackers. An industry that has traditionally spent a small fraction of its budget on cyberdefense is finding it must also teach doctors and nurses not to click on suspicious links and shore up its technical systems against hackers armed with an ever-evolving set of tools.

In some ways, health care is an easy target: Its security systems tend to be less mature than those of other industries, such as banking and tech, and its doctors and nurses depend on data to perform time-sensitive, life-saving work. Where a financial-services firm might spend a third of its budget on information technology, hospitals spend only about 2 to 3 percent, said John Halamka, the chief information officer of Beth Israel Deaconess Medical Center in Boston.

“If you’re a hacker... would you go to Fidelity or an underfunded hospital?” Halamka said. “You’re going to go where the money is and the safe is easiest to open.”

The stakes are almost uniquely high. Hospitals’ electronic systems are often in place to help prevent errors. Without computer systems, pharmacists can’t easily review patients' lab results, look up what other medications the patients are on or figure out what allergies they might have before dispensing medications. And nurses administering drugs can’t scan the medicines and the patients' wristbands as a last check that they’re giving the correct treatments. When lab results exist only on a piece of paper in a patient’s file, they could be accidentally removed by a busy doctor or nurse — and critical information could simply disappear.

A virus attacked the computer network of MedStar Health early on March 28, forcing the medical network to shut down its online database. The FBI is investigating the breach, which comes weeks after similar cyberattacks on other healthcare providers. (WUSA9)

In MedStar’s case, a virus early this week infiltrated its computer systems, forcing the health-care giant to shut down its entire network, turn away patients, postpone surgeries and resort to paper records.

“One thing I think is becoming clear, especially over the last few weeks or months, is that health care is rapidly becoming a target for this,” said Daniel Nigrin, chief information officer of Boston Children’s Hospital, whose network came under attack by the hacker collective Anonymous in April 2014. “What struck us at that point was, you know what? These attacks can do a lot more than get your data; they can really disrupt the day-to-day operations of your facilities.”

Although a handful of hospitals nationwide have been the victims of cyberattacks in recent weeks, the MedStar security breach shows hackers’ increasing boldness and sophistication. The chain is one of the biggest employers in the Baltimore-Washington region and runs ten hospitals as well as 250 clinics and other sites. MedStar spokeswoman Ann Nickels declined to elaborate on what sort of software attack the hospital suffered, but several employees have said they saw a pop-up message suggesting it was “ransomware” — a kind of software that can lock people out of systems until they make a bitcoin payment. According to a photo of that message provided by a MedStar Southern Maryland Hospital Center employee, the hackers were demanding 45 bitcoins — equivalent to about $19,000 — to restore access to MedStar’s system.

“You just have 10 days to send us the Bitcoin,” the note read. “After 10 days we will remove your private key and it’s impossible to recover your files.”

Nickels said MedStar saw “no indication that data has left our system” or that patient privacy had been compromised. In a statement, the health-care system said that it had not paid any type of ransom. In a Friday-afternoon update, the hospital said that MedStar was “approaching 90 percent functionality” of its systems.

Ransomware is not new, but cybersecurity experts and FBI data say its use is on the rise. Hospitals, of course, are not the only institutions facing such attacks. In a nine-month period in 2014, the FBI received 1,838 complaints about ransomware, and it estimates that victims lost more than $23.7 million. The next year, the bureau received 2,453 complaints, and victims lost $24.1 million. The FBI does not condone paying ransom, but its agents acknowledge that businesses are often left with a tough choice.

And hospitals, in particular, are vulnerable. In the weeks before the attack on MedStar, hackers hit Hollywood Presbyterian Medical Center in Los Angeles, extorting $17,000 in bitcoin out of the leadership, and Kentucky-based Methodist Hospital, which declared a state of emergency after an attack. Two southern California hospitals, part of Prime Healthcare Services, were attacked in March.

Justin Harvey, the chief security officer of Fidelis Cybersecurity, said the hackers’ success is likely to make them bolder, and he worries about critical infrastructure in the United States.

“I can’t comment on whether the FAA and all the power grids are up to snuff,” he said. “If they’re not, it can create a big problem.”

Craig Williams, security outreach manager at Talos, the cybersecurity research group of Cisco, said that the use of ransomware has exploded because its profit margins are so good. He estimated it to be a $100-million-a-year business.

“The malware industry is making giant steps toward ransomware, and really, the reason behind this is ransomware’s profit margin simply exceeds that of other types of criminal activity,” Williams said.

The way hackers get into a system is generally through a phishing attack – persuading an unsuspecting employee to click on a link or an attachment in an email – or by finding a network vulnerability.

That leaves hospitals with two challenges: designing systems that can resist attack and training employees.

On the network side, Williams said that health-care companies – or any companies – that do not have full-time security specialists may not be keeping up with the latest problems and patches. He noted that one strain of ransomware exploits a well-known vulnerability in networks, and when his team did a scan of the Internet this week, they found 2.1 million servers that would be susceptible to such an attack.

The cultural problem may be even harder to solve.

“You’re as vulnerable as your most gullible employee,” Halamka said.

At Beth Israel, the hospital has printed up stickers that appear on salads and cookies in the cafeteria, so that people are reminded, even when eating lunch, not to click on links in emails they didn’t expect to receive. The hospital has also conducted its own internal phishing campaigns – fake emails sent to employees to identify who needs extra training and to assess where the risks lie.

Experts said the current attacks seem to be based in Eastern Europe, although it is hard to tell whether one group alone is responsible. The hacks have similarities, to be sure, but hackers trade tools and information. One concern is that as the attacks gain more news coverage, they will inspire more copycats who will use the same technique to target other vulnerable networks.

“This thing is an industry, the black market that does this type of activity,” said Chris Ensey, the chief operating officer at Dunbar Security Solutions.

The details about MedStar’s particular case – including what particular version of ransomware might have been used and how it got into the system – remain murky. An FBI spokesman declined to provide any details – including on the type of possible ransomware – other than to say the bureau is “aware of the incident and is looking into the nature and scope of the matter.”

Staff writer John Woodrow Cox contributed to this report.

#####EOF##### Indiana elementary school takes uneaten lunches and gives them to the needy - The Washington Post

Food at an elementary school was going to waste. Now, it goes home with needy children.


A volunteer at Cultivate with some of the packaged meals that are sent to students in Elkhart, Ind., and to other charity organizations the nonprofit serves.

Most kids look forward to the weekend. But for some students at Woodland Elementary School in Elkhart, Ind., it’s not always a happy time.

“A lot of them are food insecure,” said Natalie Bickel, the supervisor of student services at Elkhart Community Schools. “They know they’re not going to have a breakfast and a lunch.”

Angel Null, whose two children attend the school, said the family had recently fallen on hard times. Her husband had been laid off from his job in the RV industry shortly after she became a stay-at-home mom last fall.

“It’s been a struggle as a mom,” she told The Washington Post. “There’s times where it’s been just peanut butter and jelly.”

But this weekend, her son, 8, and daughter, 6, came home with backpacks filled with frozen meals. They could choose from French toast and red velvet macadamia nut pancakes for breakfast. There were drumsticks and hot dogs for lunch.

Null’s children were part of a group of 20 students participating in a pilot program at Woodland Elementary. A food-rescue nonprofit organization called Cultivate has stepped in to re-purpose leftover cafeteria food into frozen meals that needy students can take home over the weekend.

“There’s a peace of mind to know there’s something in the fridge,” Null said.

In a county where nearly 13 percent of children ages 5 to 17 live in families in poverty, according to census data, Bickel said many students rely on free or reduced-price school lunches. But outside of school, their nutritional options can be limited.

The new pilot program seeks to fill that gap.

Bickel worked with Cultivate and social workers at Woodland Elementary to identify students for the program, which launched on March 29 and has so far garnered enthusiasm, she said. The students receive backpacks that double as coolers and are given eight frozen meals to enjoy over the weekend.

Cultivate takes leftovers to its facilities, where a small staff and group of volunteers compile them into meals that include a protein, a vegetable, and a starch. They’re packaged in recyclable containers and frozen to maintain freshness, then placed in backpacks that are distributed by school officials to students in the program.

When the program was announced to Woodland Elementary’s cafeteria workers, they stood up and applauded, Bickel said. “It’s something they deal with every day,” she said. “They see the need, they see the hungry kids, and to throw [extra food] away was really difficult for them.”


A student at Woodland Elementary School in Elkhart, Ind., carries a backpack filled with meals prepared by Cultivate.

Cultivate Culinary School and Catering was founded in 2016 by Jim Conklin and a local chef, Randy Ziolkowski, after a previous restaurant enterprise fell through. Dismayed by the amount of food going to waste and wanting to help their community, they founded the nonprofit to take food that would otherwise be thrown out by caterers and event spaces and repurpose it into healthy meals for those in need.

Cultivate had already begun piloting a school-lunch program similar to Woodland’s at the nearby Madison STEAM Academy in South Bend, Ind., Conklin told The Post. The group provides weekend meals for 100 students at that school, and the food comes from donors such as the University of Notre Dame, a partner, as well as local event spaces and catering services.

“Our goal is to feed hungry kids, and we want to see improved school performance, whether it’s academic, behavior, or attendance,” Conklin said. “This backpack program was close to our hearts.”

Bickel was part of a local leadership academy run by the Elkhart Chamber of Commerce. Last October, the program’s director of business development, Melissa Ramey, brought in nonprofits for the academy participants to work with, and one of them was Cultivate. Excited by the idea of working with them, Bickel put Cultivate in touch with officials at Woodland Elementary. After deciding the best course of action was the backpack lunch program, they worked with the health department and food workers at the school to implement the best approach.

Ramey was proud of the program that had resulted from the leadership academy’s collaboration and hoped that providing regular meals would help children focus on their academics.

“When they go home on the weekends and maybe they get one meal a day, whatever their parents are working hard to provide, I hope this is able to supplement that,” she said. “I hope to hear that these meals create a better outcome for these students.”

#####EOF##### Why everyone is left less secure when the NSA doesn’t help fix security flaws - The Washington Post

Why everyone is left less secure when the NSA doesn’t help fix security flaws


Gen. Michael Hayden, former CIA director and former NSA director, speaks at the Washington Post Cybersecurity Summit in Washington, D.C., on Oct. 3, 2013. (Photo by Jeffrey MacMillan)

In a frank discussion about the government's approach to vulnerabilities in cyber-infrastructure during a Washington Post Live summit Thursday, former NSA chief Michael Hayden said the agency is not always "ethically or legally compelled" to help fix flaws it knows about. If the agency thinks that no one else will be able to exploit a vulnerability, it leaves the problem unfixed to aid in its own spying efforts. That approach might be convenient for the NSA, but it needlessly endangers the security of Americans' computers.

The statement came after an audience member asked if backdoors reported in the NSA leaks introduced vulnerabilities that could be exploited by hackers. Craig Mundie, a senior adviser to the CEO at Microsoft, took a first crack at the question. He asserted that Microsoft does not engineer in any backdoors, nor has there ever been any effort to "facilitate" those kinds of things. However, he also noted he could not speak to government capabilities and added "any [backdoor] mechanism that anybody would put into something obviously creates another class of vulnerabilities."

"Nobody but us"

Hayden argued the concept of vulnerabilities was not unique to the Internet and had been an issue the NSA has dealt with since its founding. "There's a reason that America's offensive and defensive squads are up at Fort Meade," Hayden said, explaining "because both offense and defense at this world hinges on a question of vulnerability." Hayden then laid out the concept of NOBUS, which stands for "nobody but us," a framework he termed "very useful" for making macro-judgments about how to react to vulnerabilities, regardless of whether those flaws are "preexistent, not designed, mistake, intended, implanted, [or] whatever":

You look at a vulnerability through a different lens if even with the vulnerability it requires substantial computational power or substantial other attributes and you have to make the judgment who else can do this? If there's a vulnerability here that weakens encryption but you still need four acres of Cray computers in the basement in order to work it you kind of think "NOBUS" and that's a vulnerability we are not ethically or legally compelled to try to patch -- it's one that ethically and legally we could try to exploit in order to keep Americans safe from others.


To a certain extent, this NOBUS idea reflects the weighing of the dual defensive and offensive missions of the NSA. Sure, patching vulnerabilities might effectively make infrastructure safer on a broad scale. But we're talking about the same agency that reportedly has an elite offensive hacking squad of some 600 people, Tailored Access Operations, or TAO, working out of its headquarters. And NOBUS also raises a lot of questions about how the intelligence agency determines if something is likely to be exploited by adversaries.

Zero-day exploits

Take the NSA's connection to the zero-day market. Earlier this year a Freedom of Information Act (FOIA) request revealed that the agency had a significant contract with Vupen, a French company that deals in zero-day vulnerabilities -- security flaws not yet discovered or patched by vendors. Sometimes these zero-days are used to exploit systems by the hackers who discover them, sometimes vendors are told about them as part of bug bounty programs, and sometimes they end up in these digital gray markets.

The United States is a major player in these gray markets, although other nations are reportedly also in on the game. A Reuters special report from May claimed the United States was the biggest buyer of exploits from this market, with defense contractors and government agencies spending "at least tens of millions of dollars a year just on exploits." But by their very nature, these exploits would seem to fail the NOBUS test, says Christopher Soghoian, principal technologist and senior policy analyst at the ACLU's Speech, Privacy and Technology Project.

"The NSA does not have a monopoly over the exploits that it buys, whether from the black market or from defense contractors. Those same vulnerabilities can and will be discovered by other researchers too, some of whom may sell them to other governments and criminals," Soghoian said.

And while it makes sense from a defensive perspective for intelligence agencies to scour these marketplaces and try to buy exploits out of the market, that doesn't seem to be how it always works. Reuters spoke to two former White House cybersecurity advisers, Howard Schmidt and Richard Clarke, who thought the government was putting too much focus on offensive capabilities at the expense of business and consumer security. "If the U.S. government knows of a vulnerability that can be exploited, under normal circumstances, its first obligation is to tell U.S. users," Clarke said, adding "[t]here is supposed to be some mechanism for deciding how they use the information, for offense or defense. But there isn't."

Developing offensive cyber capabilities 

Sometimes purchased exploits appear to be making it into government-designed malware. For instance, the Stuxnet worm that targeted Iranian uranium facilities is widely believed to have been a joint American-Israeli development -- and a security researcher told the Economist that at least one of the four exploits it relied on was bought rather than engineered in-house.

Stuxnet also illustrates how the deployment of offensive cybertools could be bad for consumer and business IT security. Stuxnet managed to make it into the digital wild pretty quickly, infecting other industrial systems and companies. And that's not all. “Some of the zero-days used in Stuxnet were later exploited by criminals," said Soghoian. "Had the NSA provided information about the vulnerabilities to Microsoft, the company could have distributed patches, and those criminals never would have been able to exploit those vulnerabilities.”

“This is just one of many scenarios where offense and defense conflict," when it comes to cybersecurity, Soghoian said. "For the NSA to have offensive abilities they must leave the public vulnerable. When you buy a new computer, you don't have to tell the salesman if you are a terrorist, or a drug dealer. We all use the same computers and software. What this means is that for the NSA to have the capability to hack into the computer of a terrorist, they need to have the capability to hack into everyone else's computer too. They're prioritizing offense over defense, that's really what it comes down to.”

But while TAO is reportedly America's national digital offense, we aren't the only ones playing that game. Earlier this year, a report from cybersecurity firm Mandiant suggested that the Chinese military was behind a large cyber-espionage ring, and hackers who are believed to have ties to the Iranian government have successfully managed to access the control software for oil pipes and breached Navy computer networks. The growing profile of these other well-supported adversaries might make the case stronger for a focus on making the digital battlefield more secure, not less.

An NSA spokesman declined to comment on Hayden's comments, but defended the NSA's track record on cybersecurity, saying, “NSA’s Information Assurance Directorate sets the security requirements to protect our government’s national security systems, shares our understanding of vulnerabilities with the private sector, and advocates for the best vulnerability mitigations. We continue to partner with federal organizations, private industry, and academia.”

#####EOF##### Washington Post: Breaking News, World, US, DC News & Analysis - The Washington Post
“This request is about policy, not politics,” the Ways and Means Committee chairman said. The president has said he does not plan to hand over his tax returns to Congress — and that he would fight it to the Supreme Court, according to two administration officials.
The president’s son-in-law, described in a House committee document as “Official 1,” had “significant disqualifying factors,” according to a White House whistleblower.
A Coast Guard craft patrols a waterway in front of Mar-a-Lago resort. (AP)
An entranceway to the West Palm Beach, Fla., resort. (Getty Images)
A motorcade carrying the president departs Mar-a-Lago. (AP)
The FBI is looking at why a Chinese national illegally gained access to Mar-a-Lago last weekend.
The new accounts emerged weeks before former vice president Joe Biden is expected to announce his decision about a White House bid. They reflected a feeling among some women that he was struggling to understand why his behavior might at times be inappropriate or unwelcome.
Play the latest episode of Post Reports, the premier daily podcast from The Washington Post.
The alleged multimillion-dollar bribery scheme is rare and widely shunned. But that doesn’t necessarily preclude other underhanded tactics, including attempts to sabotage students who are competing for coveted spaces at the most selective schools.
As the Consumer Product Safety Commission’s acting chairwoman awaits confirmation to continue serving beyond this year, two Democrats seek information on the role of the Trump appointee in the agency’s probe into BOB jogging strollers.
Fact Checker
Analysis
The president’s statement that Mexico only this week started to detain thousands of Central Americans at its southern border is nonsense.
The move to ease the confirmation of President Trump’s nominees came after Senate debate exposed raw emotions delivered in highly personal terms.
The exchange was the first time Labor Secretary Alexander Acosta, a former U.S. attorney, has been publicly questioned about the deal since a federal judge ruled that it had violated the law.
Advocates say walkways are too narrow for the 36 million annual visitors, forcing them off the paths to tread on the roots of the cherry blossom trees whose beauty they come to celebrate. Meanwhile, the entire basin is slowly sinking.
“I felt that I needed to love this child and keep her safe,” Liz Smith, a nurse in Massachusetts, said of the girl, who had been a ward of the state since she was 3 months old.
#####EOF##### U.S., British intelligence mining data from nine U.S. Internet companies in broad secret program - The Washington Post

U.S., British intelligence mining data from nine U.S. Internet companies in broad secret program

The U.S. government is accessing top Internet companies’ servers to track foreign targets. Reporter Barton Gellman talks about the source who revealed this top-secret information and how he believes his whistleblowing was worth whatever consequences are ahead. (Brook Silva-Braga/The Washington Post)

The National Security Agency and the FBI are tapping directly into the central servers of nine leading U.S. Internet companies, extracting audio and video chats, photographs, e-mails, documents, and connection logs that enable analysts to track foreign targets, according to a top-secret document obtained by The Washington Post.

The program, code-named PRISM, has not been made public until now. It may be the first of its kind. The NSA prides itself on stealing secrets and breaking codes, and it is accustomed to corporate partnerships that help it divert data traffic or sidestep barriers. But there has never been a Google or Facebook before, and it is unlikely that there are richer troves of valuable intelligence than the ones in Silicon Valley.

Equally unusual is the way the NSA extracts what it wants, according to the document: “Collection directly from the servers of these U.S. Service Providers: Microsoft, Yahoo, Google, Facebook, PalTalk, AOL, Skype, YouTube, Apple.”

London’s Guardian newspaper reported Friday that GCHQ, Britain’s equivalent of the NSA, also has been secretly gathering intelligence from the same Internet companies through an operation set up by the NSA.

According to documents obtained by The Guardian, PRISM would appear to allow GCHQ to circumvent the formal legal process required in Britain to seek personal material such as emails, photos and videos from an Internet company based outside of the country.

PRISM was launched from the ashes of President George W. Bush’s secret program of warrantless domestic surveillance in 2007, after news media disclosures, lawsuits and the Foreign Intelligence Surveillance Court forced the president to look for new authority.

Congress obliged with the Protect America Act in 2007 and the FISA Amendments Act of 2008, which immunized private companies that cooperated voluntarily with U.S. intelligence collection. PRISM recruited its first partner, Microsoft, and began six years of rapidly growing data collection beneath the surface of a roiling national debate on surveillance and privacy. Late last year, when critics in Congress sought changes in the FISA Amendments Act, the only lawmakers who knew about PRISM were bound by oaths of office to hold their tongues.

The court-approved program is focused on foreign communications traffic, which often flows through U.S. servers even when sent from one overseas location to another. Between 2004 and 2007, Bush administration lawyers persuaded federal FISA judges to issue surveillance orders in a fundamentally new form. Until then the government had to show probable cause that a particular “target” and “facility” were both connected to terrorism or espionage.

In four new orders, which remain classified, the court defined massive data sets as “facilities” and agreed to certify periodically that the government had reasonable procedures in place to minimize collection of “U.S. persons” data without a warrant.

In a statement issued late Thursday, Director of National Intelligence James R. Clapper said “information collected under this program is among the most important and valuable foreign intelligence information we collect, and is used to protect our nation from a wide variety of threats. The unauthorized disclosure of information about this important and entirely legal program is reprehensible and risks important protections for the security of Americans.”

Clapper added that there were numerous inaccuracies in reports about PRISM by The Post and the Guardian newspaper, but he did not specify any.

Jameel Jaffer, deputy legal director of the American Civil Liberties Union, said: “I would just push back on the idea that the court has signed off on it, so why worry? This is a court that meets in secret, allows only the government to appear before it, and publishes almost none of its opinions. It has never been an effective check on government.”

Several companies contacted by The Post said they had no knowledge of the program, did not allow direct government access to their servers and asserted that they responded only to targeted requests for information.

“We do not provide any government organization with direct access to Facebook servers,” said Joe Sullivan, chief security officer for Facebook. “When Facebook is asked for data or information about specific individuals, we carefully scrutinize any such request for compliance with all applicable laws, and provide information only to the extent required by law.”

“We have never heard of PRISM,” said Steve Dowling, a spokesman for Apple. “We do not provide any government agency with direct access to our servers, and any government agency requesting customer data must get a court order.”

It is possible that the conflict between the PRISM slides and the company spokesmen is the result of imprecision on the part of the NSA author. In another classified report obtained by The Post, the arrangement is described as allowing “collection managers [to send] content tasking instructions directly to equipment installed at company-controlled locations,” rather than directly to company servers.

Government officials and the document itself made clear that the NSA regarded the identities of its private partners as PRISM’s most sensitive secret, fearing that the companies would withdraw from the program if exposed. “98 percent of PRISM production is based on Yahoo, Google and Microsoft; we need to make sure we don’t harm these sources,” the briefing’s author wrote in his speaker’s notes.

An internal presentation of 41 briefing slides on PRISM, dated April 2013 and intended for senior analysts in the NSA’s Signals Intelligence Directorate, described the new tool as the most prolific contributor to the President’s Daily Brief, which cited PRISM data in 1,477 items last year. According to the slides and other supporting materials obtained by The Post, “NSA reporting increasingly relies on PRISM” as its leading source of raw material, accounting for nearly 1 in 7 intelligence reports.

That is a remarkable figure in an agency that measures annual intake in the trillions of communications. It is all the more striking because the NSA, whose lawful mission is foreign intelligence, is reaching deep inside the machinery of American companies that host hundreds of millions of American-held accounts on American soil.

The technology companies, whose cooperation is essential to PRISM operations, include most of the dominant global players of Silicon Valley, according to the document. They are listed on a roster that bears their logos in order of entry into the program: “Microsoft, Yahoo, Google, Facebook, PalTalk, AOL, Skype, YouTube, Apple.” PalTalk, although much smaller, has hosted traffic of substantial intelligence interest during the Arab Spring and in the ongoing Syrian civil war.

Dropbox, the cloud storage and synchronization service, is described as “coming soon.”

Sens. Ron Wyden (D-Ore.) and Mark Udall (D-Colo.), who had classified knowledge of the program as members of the Senate Intelligence Committee, were unable to speak of it when they warned in a Dec. 27, 2012, floor debate that the FISA Amendments Act had what both of them called a “back-door search loophole” for the content of innocent Americans who were swept up in a search for someone else.

“As it is written, there is nothing to prohibit the intelligence community from searching through a pile of communications, which may have been incidentally or accidentally collected without a warrant, to deliberately search for the phone calls or e-mails of specific Americans,” Udall said.

Wyden repeatedly asked the NSA to estimate the number of Americans whose communications had been incidentally collected, and the agency’s director, Lt. Gen. Keith B. Alexander, insisted there was no way to find out. Eventually Inspector General I. Charles McCullough III wrote Wyden a letter stating that it would violate the privacy of Americans in NSA data banks to try to estimate their number.

Roots in the ’70s

PRISM is an heir, in one sense, to a history of intelligence alliances with as many as 100 trusted U.S. companies since the 1970s. The NSA calls these Special Source Operations, and PRISM falls under that rubric.

The Silicon Valley operation works alongside a parallel program, code-named BLARNEY, that gathers up “metadata” — technical information about communications traffic and network devices — as it streams past choke points along the backbone of the Internet. BLARNEY’s top-secret program summary, set down in the slides alongside a cartoon insignia of a shamrock and a leprechaun hat, describes it as “an ongoing collection program that leverages IC [intelligence community] and commercial partnerships to gain access and exploit foreign intelligence obtained from global networks.”

But the PRISM program appears to more nearly resemble the most controversial of the warrantless surveillance orders issued by President George W. Bush after the al-Qaeda attacks of Sept. 11, 2001. Its history, in which President Obama presided over exponential growth in a program that candidate Obama criticized, shows how fundamentally surveillance law and practice have shifted away from individual suspicion in favor of systematic, mass collection techniques.

The Obama administration points to ongoing safeguards in the form of “extensive procedures, specifically approved by the court, to ensure that only non-U.S. persons outside the U.S. are targeted, and that minimize the acquisition, retention and dissemination of incidentally acquired information about U.S. persons.”

And it is true that the PRISM program is not a dragnet, exactly. From inside a company’s data stream the NSA is capable of pulling out anything it likes, but under current rules the agency does not try to collect it all.

Analysts who use the system from a Web portal at Fort Meade, Md., key in “selectors,” or search terms, that are designed to produce at least 51 percent confidence in a target’s “foreignness.” That is not a very stringent test. Training materials obtained by The Post instruct new analysts to make quarterly reports of any accidental collection of U.S. content, but add that “it’s nothing to worry about.”

Even when the system works just as advertised, with no American singled out for targeting, the NSA routinely collects a great deal of American content. That is described as “incidental,” and it is inherent in contact chaining, one of the basic tools of the trade. To collect on a suspected spy or foreign terrorist means, at minimum, that everyone in the suspect’s inbox or outbox is swept in. Intelligence analysts are typically taught to chain through contacts two “hops” out from their target, which increases “incidental collection” exponentially. The same math explains the aphorism, from the John Guare play, that no one is more than “six degrees of separation” from any other person.
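The growth described above can be made concrete with a small sketch. This is a hypothetical illustration of two-hop contact chaining, not anything drawn from the NSA documents: a toy contact graph in which the target corresponds with three people, each of whom corresponds with three more, walked breadth-first out to a given number of hops.

```python
from collections import deque

def chain(contacts, target, hops=2):
    """Breadth-first walk of a contact graph, out to `hops` hops from the target."""
    seen = {target}
    frontier = deque([(target, 0)])
    while frontier:
        person, depth = frontier.popleft()
        if depth == hops:
            continue
        for neighbor in contacts.get(person, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append((neighbor, depth + 1))
    return seen - {target}  # everyone swept in besides the target

# Toy graph: the target emails 3 people, each of whom emails 3 more.
contacts = {"target": ["a", "b", "c"],
            "a": ["a1", "a2", "a3"],
            "b": ["b1", "b2", "b3"],
            "c": ["c1", "c2", "c3"]}
print(len(chain(contacts, "target", hops=1)))  # 3 people one hop out
print(len(chain(contacts, "target", hops=2)))  # 12 people within two hops
```

With even a modest average of f contacts per account, each additional hop multiplies the swept-in population by roughly f, which is the exponential growth the article describes.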

A ‘directive’

In exchange for immunity from lawsuits, companies such as Yahoo and AOL are obliged to accept a “directive” from the attorney general and the director of national intelligence to open their servers to the FBI’s Data Intercept Technology Unit, which handles liaison to U.S. companies from the NSA. In 2008, Congress gave the Justice Department authority for a secret order from the Foreign Intelligence Surveillance Court to compel a reluctant company “to comply.”

In practice, there is room for a company to maneuver, delay or resist. When a clandestine intelligence program meets a highly regulated industry, said a lawyer with experience in bridging the gaps, neither side wants to risk a public fight. The engineering problems are so immense, in systems of such complexity and frequent change, that the FBI and NSA would be hard pressed to build in back doors without active help from each company.

Apple demonstrated that resistance is possible when it held out for more than five years, for reasons unknown, after Microsoft became PRISM’s first corporate partner in May 2007. Twitter, which has cultivated a reputation for aggressive defense of its users’ privacy, is still conspicuous by its absence from the list of “private sector partners.”

Google, like the other companies, denied that it permitted direct government access to its servers.

“Google cares deeply about the security of our users’ data,” a company spokesman said. “We disclose user data to government in accordance with the law, and we review all such requests carefully. From time to time, people allege that we have created a government ‘back door’ into our systems, but Google does not have a ‘back door’ for the government to access private user data.”

Microsoft also provided a statement: “We provide customer data only when we receive a legally binding order or subpoena to do so, and never on a voluntary basis. In addition we only ever comply with orders for requests about specific accounts or identifiers. If the government has a broader voluntary national security program to gather customer data we don’t participate in it.”

Yahoo also issued a denial.

“Yahoo! takes users’ privacy very seriously,” the company said in a statement. “We do not provide the government with direct access to our servers, systems, or network.”

Like market researchers, but with far more privileged access, collection managers in the NSA’s Special Source Operations group, which oversees the PRISM program, are drawn to the wealth of information about their subjects in online accounts. For much the same reason, civil libertarians and some ordinary users may be troubled by the menu available to analysts who hold the required clearances to “task” the PRISM system.

There has been “continued exponential growth in tasking to Facebook and Skype,” according to the PRISM slides. With a few clicks and an affirmation that the subject is believed to be engaged in terrorism, espionage or nuclear proliferation, an analyst obtains full access to Facebook’s “extensive search and surveillance capabilities against the variety of online social networking services.”

According to a separate “User’s Guide for PRISM Skype Collection,” that service can be monitored for audio when one end of the call is a conventional telephone and for any combination of “audio, video, chat, and file transfers” when Skype users connect by computer alone. Google’s offerings include Gmail, voice and video chat, Google Drive files, photo libraries, and live surveillance of search terms.

Firsthand experience with these systems, and horror at their capabilities, is what drove a career intelligence officer to provide PowerPoint slides about PRISM and supporting materials to The Washington Post in order to expose what he believes to be a gross intrusion on privacy. “They quite literally can watch your ideas form as you type,” the officer said.

Poitras is a documentary filmmaker and MacArthur Fellow. Julie Tate, Robert O’Harrow Jr., Cecilia Kang and Ellen Nakashima contributed to this report.

PostEverything

How to fight mass surveillance even though Congress just reauthorized it

What the battle looks like after Section 702's reauthorization


The National Security Agency is collecting more data about you than you might like. (NSA via Reuters)
Bruce Schneier is a security technologist and a lecturer at the Kennedy School of Government at Harvard University. His new book, "Click Here to Kill Everybody," will be published in September.

For over a decade, civil libertarians have been fighting government mass surveillance of innocent Americans over the Internet. We’ve just lost an important battle. On Jan. 18, when President Trump signed the renewal of Section 702, domestic mass surveillance became effectively a permanent part of U.S. law.

Section 702 was initially passed in 2008, as an amendment to the Foreign Intelligence Surveillance Act of 1978. As the title of that law says, it was billed as a way for the National Security Agency to spy on non-Americans located outside the United States. It was supposed to be an efficiency and cost-saving measure: The NSA was already permitted to tap communications cables located outside the country, and it was already permitted to tap communications cables from one foreign country to another that passed through the United States. Section 702 allowed it to tap those cables from inside the United States, where it was easier. It also allowed the NSA to request surveillance data directly from Internet companies under a program called PRISM.

The problem is that this authority also gave the NSA the ability to collect foreign communications and data in a way that inherently and intentionally also swept up Americans’ communications as well, without a warrant. Other law enforcement agencies are allowed to ask the NSA to search those communications, give their contents to the FBI and other agencies and then lie about their origins in court.

In 1978, after Watergate had revealed the Nixon administration’s abuses of power, we erected a wall between intelligence and law enforcement that prevented precisely this kind of sharing of surveillance data under any authority less restrictive than the Fourth Amendment. Weakening that wall is incredibly dangerous, and the NSA should never have been given this authority in the first place.

Arguably, it never was. The NSA had been doing this type of surveillance illegally for years, something that was first made public in 2006. Section 702 was secretly used as a way to paper over that illegal collection, but nothing in the text of the amendment gives the NSA this authority. We didn’t know that the NSA was using this law as the statutory basis for this surveillance until Edward Snowden showed us in 2013.

Civil libertarians have been battling this law in both Congress and the courts ever since it was proposed, and the NSA’s domestic surveillance activities even longer. What this most recent vote tells me is that we’ve lost the fight.

Section 702 was passed under George W. Bush in 2008, reauthorized under Barack Obama in 2012, and now reauthorized again under Trump. In all three cases, congressional support was bipartisan. It has survived multiple lawsuits by the Electronic Frontier Foundation, the ACLU and others. It has survived the revelations by Snowden that it was being used far more extensively than Congress or the public believed, and numerous public reports of violations of the law. It has even survived Trump’s belief that he was being personally spied on by the intelligence community, as well as any congressional fears that Trump could abuse the authority in the coming years. And though this extension lasts only six years, it’s inconceivable to me that it will ever be repealed at this point.

So what do we do? If we can’t fight this particular statutory authority, where’s the new front on surveillance? There are, it turns out, reasonable modifications that target surveillance generally, rather than any one statutory authority. We need to look at U.S. surveillance law as a whole.

First, we need to strengthen the minimization procedures that limit incidental collection. Since the Internet’s development, all the world’s communications have traveled over a single global network. It’s impossible to collect only foreign communications, because they’re invariably mixed in with domestic ones. This is called “incidental” collection, but that’s a misleading name: the data is collected knowingly and searched regularly. The intelligence community needs much stronger restrictions on which American communications channels it can access without a court order, and rules requiring it to delete data it inadvertently collects. Most important, “collection” should be defined as the point at which the NSA takes a copy of a communication, not the later point at which it searches its databases.

Second, we need to limit how other law enforcement agencies can use incidentally collected information. Today, those agencies can query a database of incidental collection on Americans. The NSA can legally pass information to those other agencies. This has to stop. Data collected by the NSA under its foreign surveillance authority should not be used as a vehicle for domestic surveillance.

The most recent reauthorization modified this lightly, forcing the FBI to obtain a court order when querying the 702 data for a criminal investigation. There are still exceptions and loopholes, though.

Third, we need to end what’s called “parallel construction.” Today, when a law enforcement agency uses evidence found in this NSA database to arrest someone, it doesn’t have to disclose that fact in court. It can reconstruct the evidence in some other manner once it knows about it, and then pretend it learned of it that way. This right to lie to the judge and the defense is corrosive to liberty, and it must end.

Pressure to reform the NSA will probably first come from Europe. Already, European Union courts have pointed to warrantless NSA surveillance as a reason to keep Europeans’ data out of U.S. hands. Right now, there is a fragile agreement between the E.U. and the United States — called “Privacy Shield” — that requires American companies to maintain certain privacy safeguards for international data flows. NSA surveillance violates those safeguards, and it’s only a matter of time before E.U. courts rule that way. That’ll have significant effects on both government and corporate surveillance of Europeans and, by extension, the entire world.

Further pressure will come from the increased surveillance coming from the Internet of Things. When your home, car and body are awash in sensors, privacy from both governments and corporations will become increasingly important. Sooner or later, society will reach a tipping point where it’s all too much. When that happens, we’re going to see significant pushback against surveillance of all kinds. That’s when we’ll get new laws that revise all government authorities in this area: a clean sweep for a new world, one with new norms and new fears.

It’s possible that a federal court will rule on Section 702. Although there have been many lawsuits challenging the legality of what the NSA is doing and the constitutionality of the 702 program, no court has ever ruled on those questions. The Bush and Obama administrations successfully argued that defendants don’t have legal standing to sue. That is, they have no right to sue because they don’t know they’re being targeted. If any of the lawsuits can get past that, things might change dramatically.

Meanwhile, much of this is the responsibility of the tech sector. This problem exists primarily because Internet companies collect and retain so much personal data and allow it to be sent across the network with minimal security. Since the government has abdicated its responsibility to protect our privacy and security, these companies need to step up: Minimize data collection. Don’t save data longer than absolutely necessary. Encrypt what has to be saved. Well-designed Internet services will safeguard users, regardless of government surveillance authority.

For the rest of us concerned about this, it’s important not to give up hope. Everything we do to keep the issue in the public eye — and not just when the authority comes up for reauthorization again in 2024 — hastens the day when we will reaffirm our rights to privacy in the digital age.

Read more:

Government lawyers don’t understand the Internet. That’s a problem.

Your WiFi-connected thermostat can take down the whole Internet. We need new regulations.

Why you should side with Apple, not the FBI, in the San Bernardino iPhone case

Foreign Policy - The Washington Post


Health-care professionals, refugees and aid groups describe the deteriorating conditions.

Trump administration issues sharply worded statement defending U.S. negotiator.

The proposed White House budget would slash 24 percent of spending by the State Department and USAID.


The combative conservative has redefined the job of national security adviser from synthesizer of different views to arbiter of what the president needs to hear.

Kim said he has a “feeling good results will come.”

Russia’s foreign minister claims the U.S. is asking for Moscow's advice as the world awaits a meeting between Trump and Kim.

A think-tank’s assessment says the president works at cross purposes to his administration’s own strategy.

An expert on extrajudicial killings is in Turkey seeking to determine who ordered The Washington Post columnist killed in the Saudi Consulate in October.

Emergency meeting highlights the acrimonious state of U.S. relations with Russia and China.

A. Wess Mitchell points to personal and professional reasons; his departure will create a key vacancy.

Officials juggled funds to come up with enough to meet a full payroll for at least two weeks.

The outgoing Pentagon chief resigned Dec. 20 after a series of disagreements with President Trump.

Some military leaders fear the beginning of a more complicated period under Trump.

Foreign Minister Miro Cerar told U.S. officials that China and Russia are investing heavily in Balkan countries while Washington’s profile is hardly visible.

Though the United States has few allies that agree with its decision to leave the nuclear deal with Iran, the White House is seeking to expand punishment against Iran in other spheres.

Nauert is the latest transplant from Fox News to be elevated to positions in the White House.

Despite U.S. lobbying, all Arab states voted against the measure, suggesting that the White House will have trouble rallying support for the peace proposal it is about to unveil.

The Security Council debate at which she spoke quickly slid into a broad condemnation of Russia's annexation of Crimea four years ago.

The move could limit U.S. assistance and prohibit financial transactions.

The Syrian ambassador vowed to retake the Golan Heights “by peace or by war,” while Israel said the plateau is essential to Israel’s security and it will never withdraw.





Facebook is America’s scapegoat du jour


A Facebook employee walks past a sign at Facebook headquarters. (Jeff Chiu/AP)
Columnist

Whenever a new communications medium is born, panic follows. Witness the collective freakout when America learned that a firm called Cambridge Analytica had used some deceptively procured Facebook data to create what the Guardian dubbed “Steve Bannon’s psychological warfare tool.”

Facebook feeds lit up with outraged Hillary Clinton voters announcing that they were shutting down their accounts because Facebook didn’t care about privacy or the integrity of American elections. To judge from the outrage, one might have thought all this had occurred during an actual war. One might also have thought Bannon had a weapon straight out of some paranoid thriller, a machine capable of bending the human mind to its will.

What he actually had? Voter targeting based on what people liked and shared on Facebook.

Cambridge Analytica claimed that from this, it was able to construct precision psychological profiles. But professionals are skeptical that this ever, well, worked.

“The idea that we’re going to profile your personality sounds like a spy novel, and is extremely compelling to would-be spymasters like Steve Bannon,” says Patrick Ruffini, cofounder of Echelon Insights, a conservative polling and analytics firm. The problem, he says, is that psychological traits don’t necessarily give you great insight into voting behavior — at least not any better than other traits, such as socioeconomic status and community of residence, that political campaigns have long targeted.

“Those are the things that most smart analytics people are focused on,” says Ruffini, adding after a pause: “But psychographic targeting sounds cool.”

Facebook, of course, lets you target exactly those boring, old demographic qualities without having to steal any data from users; all you have to do is buy some Facebook ads. Which both 2016 campaigns did extensively, without anyone worrying that voters’ minds were being warped.

Sure, but . . . mightn’t the Cambridge Analytica data have given Donald Trump that little extra he needed in Wisconsin and Pennsylvania? Well, Ted Cruz was the firm’s original client in the 2016 election, and he spent nearly as much as the Trump campaign did; if they’re really the masterminds so many seem to believe, how come he’s not president?

In fact, as Kenneth P. Vogel of the New York Times pointed out on Twitter, the firm seems to have been more useful as a way to get donor money out of the Mercer family, who invested in it, than at shifting voter behavior. And as Ruffini points out, targeting on Facebook tends to work best on the folks who are already reliable voters for one party or another; it’s harder to use it to move the moderate voters who gave Trump his narrow victories in swing states.

The freakout seems especially strange when you consider that many of the outraged must at some point have been well aware that this kind of Facebook data had been collected, and used for political purposes, before. Because the Obama campaign did pretty much the same thing. Sasha Issenberg wrote a glowing profile of the operation in 2012. It got very wide circulation on the same social media going nuts over Cambridge Analytica.

To be clear, what Cambridge Analytica did was somewhat worse, because at least the people who signed onto the Obama app that scraped their data knew they were helping a political campaign; the Cambridge Analytica data came from folks who thought they were just taking a personality quiz. But both apps gathered data not just on those users, but on millions of their friends, who had not consented to have their data used. And while the Obama campaign’s use did not, as Cambridge Analytica is alleged to have done, violate Facebook’s terms of service, that’s because Facebook appears to have just smiled and let it get away with it, which is itself troubling. 

Facebook no longer allows anyone to do this, thankfully, and hasn’t for years, as Julian Sanchez, an analyst who covers surveillance and privacy for the Cato Institute, points out. Which makes it all the weirder that this became so controversial now. If you really care about privacy, and unbiased American elections, wasn’t it an equal problem for democracy if Barack Obama used this kind of data mining to beat Mitt Romney?

The reality is that media panics are never just about the media itself; they are a scapegoat onto which we unload our larger anxieties. An 18th-century English upper class, worried about social and economic upheaval, blamed novels for seducing women into rebelling against their roles; an American intelligentsia fretful about the “empty consumerism” of affluent postwar America panicked that subliminal advertising was brainwashing people. Thus are vast and complex social forces reduced to something malign, but at least manageable.

And so with Facebook: A liberal cultural elite lost an election they thought was in the bag. Trump did an end run around all the institutional gatekeepers who were supposed to keep someone like him out of office. The idea that this could happen is terrifying enough for the old guard; the idea that it could happen and there might not be any way to stop it is intolerable. Thus, we must find not merely a cause, but a cause that is amenable to intervention. Facebook had the grave misfortune of being the nearest goat to hand.

This starts to look like an exercise in what the philosopher Robert Nozick once called “normative sociology”: the study of what the causes of things ought to be. The answers such study produces are undoubtedly morally satisfying. But they are not very useful if you actually want to change the world. Those of us who opposed the election of Donald Trump in 2016 cannot afford such self-indulgence. Not unless we’re ready for a rerun in 2020.



PostEverything

Don’t fear the TSA cutting airport security. Be glad that they’re talking about it.

We need to evaluate airport security based on concrete costs and benefits, and not continue to implement security theater based on fear.


Passengers wait in line to check in at McCarran International Airport in Las Vegas on June 29. (John Locher/AP)
Bruce Schneier is a security technologist and a lecturer at the Kennedy School of Government at Harvard University. His new book, "Click Here to Kill Everybody," will be published in September.

Last week, CNN reported that the Transportation Security Administration is considering eliminating security at U.S. airports that fly only smaller planes — 60 seats or fewer. Passengers connecting to larger planes would clear security at their destinations.

To be clear, the TSA has put forth no concrete proposal. The internal agency working group’s report obtained by CNN contains no recommendations. It’s nothing more than 20 people examining the potential security risks of the policy change. It’s not even new: The TSA considered this back in 2011, and the agency reviews its security policies every year. But commentary around the news has been strongly negative. Regardless of the idea’s merit, it will almost certainly not happen. That’s the result of politics, not security: Sen. Charles E. Schumer (D-N.Y.), one of numerous outraged lawmakers, has already penned a letter to the agency saying that “TSA documents proposing to scrap critical passenger security screenings, without so much as a metal detector in place in some airports, would effectively clear the runway for potential terrorist attacks.” He continued, “It simply boggles the mind to even think that the TSA has plans like this on paper in the first place.”

We don’t know enough to conclude whether this is a good idea, but it shouldn’t be dismissed out of hand. We need to evaluate airport security based on concrete costs and benefits, and not continue to implement security theater based on fear. And we should applaud the agency’s willingness to explore changes in the screening process.

There is already a tiered system for airport security, varying for both airports and passengers. Many people are enrolled in TSA PreCheck, allowing them to go through checkpoints faster and with less screening. Smaller airports don’t have modern screening equipment like full-body scanners or CT baggage screeners, making it impossible for them to detect some plastic explosives. Any would-be terrorist is already able to pick and choose his flight conditions to suit his plot.

Over the years, I have written many essays critical of the TSA and airport security in general. Most of it is security theater — measures that make us feel safer without improving security. For example, the liquids ban makes no sense as implemented, because there’s no penalty for repeatedly trying to evade the scanners. The full-body scanners are terrible at detecting the explosive material PETN when it is well concealed, even though detecting such explosives is their whole point.

There are two basic kinds of terrorists. The amateurs will be deterred or detected by even basic security measures. The professionals will figure out how to evade even the most stringent measures. I’ve repeatedly said that the two things that have made flying safer since 9/11 are reinforcing the cockpit doors and persuading passengers that they need to fight back. Everything beyond that isn’t worth it.

It’s always possible to increase security by adding more onerous — and expensive — procedures. If that were the only concern, we would all be strip-searched and prohibited from traveling with luggage. Realistically, we need to analyze whether the increased security of any measure is worth the cost, in money, time and convenience. We spend $8 billion a year on the TSA, and we’d like to get the most security possible for that money.

This is exactly what that TSA working group was doing. CNN reported that the group specifically evaluated the costs and benefits of eliminating security at minor airports, saving $115 million a year with a “small (nonzero) undesirable increase in risk related to additional adversary opportunity.” That money could be used to bolster security at larger airports or to reduce threats totally removed from airports.

We need more of this kind of thinking, not less. In 2017, political scientists Mark Stewart and John Mueller published a detailed evaluation of airport security measures based on the cost to implement and the benefit in terms of lives saved. They concluded that most of what our government does either isn’t effective at preventing terrorism or is simply too expensive to justify the security it does provide. Others might disagree with their conclusions, but their analysis provides enough detailed information to have a meaningful argument.

The more we politicize security, the worse off we are. People are generally terrible judges of risk. We fear threats in the news out of proportion with the actual dangers. We overestimate rare and spectacular risks, and underestimate commonplace ones. We fear specific “movie-plot threats” that we can bring to mind. That’s why we fear flying more than driving, even though the latter kills about 35,000 people each year — about a 9/11’s worth of deaths each month. And it’s why the idea of the TSA eliminating security at minor airports fills us with fear. We can imagine the plot unfolding, only without Bruce Willis saving the day.
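The flying-versus-driving comparison above checks out arithmetically. A quick verification, using the figure stated in the text and the commonly cited 9/11 death toll of 2,977 victims:

```python
# Rough check of the column's comparison (figures as stated in the text).
annual_road_deaths = 35_000   # approximate U.S. road deaths per year
sept11_deaths = 2_977         # commonly cited 9/11 victim count
per_month = annual_road_deaths / 12
print(round(per_month))       # prints 2917 -- close to one 9/11 per month
```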

Very little today is immune to politics, including the TSA. Politics drove most of the agency’s decisions in the early years after the 9/11 terrorist attacks. That the TSA is willing to consider politically unpopular ideas is a credit to the organization. Let’s let the agency perform its analyses in peace.

#####EOF##### Legislators struggle with tech. That’s why we need the Office of Technology Assessment. - The Washington Post
The Post's View

Legislators struggle with tech. That’s why we need the Office of Technology Assessment.


Facebook CEO Mark Zuckerberg appears for a hearing at the Hart Senate Office Building on April 10 in Washington. (Matt McClain/The Washington Post)

CONGRESS NEEDS to hire more teachers — for itself. As high-profile hearings have made clear this year, lawmakers struggle to understand the Internet platforms that dominate online life, and given the limited resources the legislative branch has to build up its knowledge base, that is no surprise. Thankfully, recent appropriations bills offer reason for hope.

Over the past two decades, two Democratic physicists-turned-congressmen have led the charge for the resurrection of the Office of Technology Assessment, which from 1972 to its 1995 defunding provided representatives with nonpartisan analysis of science and technical issues. Though Republicans have been largely resistant to the measure, the current legislation requires the Congressional Research Service to examine the need for an additional entity to dispense technological guidance. The Government Accountability Office has also been instructed to evaluate how to give its tech assessment program more prominence.

It is crucial for legislators to grapple with technological issues on a higher level than most have so far proved themselves capable of. OTA was established in recognition of technology’s “increasingly extensive, pervasive, and critical” impact on all of us. That has not changed. Technology permeates society far beyond the buzziest issues surrounding Silicon Valley’s most powerful companies, reaching areas from cybersecurity to biomedical research to space exploration. When Congress steps in, it needs to understand what it is doing — whether it’s reworking regulatory frameworks to accommodate innovations or figuring out the most efficient way to administer programs such as Social Security in the digital age.

Some believe that altering the GAO’s tech assessment program could bring Congress all the benefits OTA used to offer, and that doing so would be politically easier than recreating a separate office. GAO says it is starting to make changes. Yet GAO’s institutional culture centers on audits and investigations, and it lacks the hallmarks of OTA in its heyday. Those include a larger permanent staff of subject experts with whom legislators can build relationships, as well as the independence to better compete for resources.

In fact, OTA furnished Congress with exactly what it needs right now: careful analysis of the toughest tech issues of the day and the policy options to address them. It was not perfect — OTA was often criticized for moving too slowly, and in the digital age, speed matters more than ever — but its absence is painfully felt. Democrats may struggle to secure Republican buy-in to restore the office, not to mention the funding necessary for the venture to truly succeed. They should try anyway. Knowing what one is talking about should not be a partisan issue.


#####EOF##### The Cybersecurity 202: Medical devices are woefully insecure. These hospitals and manufacturers want to fix that - The Washington Post
PowerPost

The Cybersecurity 202: Medical devices are woefully insecure. These hospitals and manufacturers want to fix that

THE KEY

Medical devices — such as pacemakers, insulin pumps and MRI machines — are increasingly vulnerable to hacking. As of today, however, there’s no federal mandate for those devices to have cybersecurity protections.

A government-backed coalition of hospitals and medical device manufacturers took matters into their own hands on Monday. They released a 53-page “joint security plan” outlining a slew of low-hanging-fruit protections that manufacturers should implement and hospitals should demand.

The plan released by the Healthcare Sector Coordinating Council — a liaison on security issues between industry and government — won’t alone fix the cybersecurity problems plaguing the health-care industry. It effectively amounts to a voluntary to-do list for manufacturers.

Still, the council's executive director, Greg Garcia, tells me it marks a sea change: Companies and hospitals are finally signaling they are willing to cooperate on fixing the problem, rather than saying it's the other's responsibility to fix.

“The big picture is this is truly a recognition that this is a shared responsibility,” Garcia told me. “The circular finger pointing should end.”

The plan is a sign the medical device industry and hospitals are unwilling to wait for Congress to catch up to the threats. Data theft and malware attacks have rocked the health-care industry in recent years, compromising patients’ data and even threatening their lives. The 2015 breach at health insurer Anthem compromised the information of nearly 80 million people, for example, while a 2017 wave of ransomware attacks locked up patient records at 16 UK hospitals, forcing them to divert patients who needed emergency care.

Cybersecurity researchers have also raised alarms about vulnerabilities in implantable medical devices that hackers could exploit to injure or even kill patients. Former vice president Dick Cheney famously had his internal pacemaker taken offline because of hacking fears.

Garcia himself acknowledges the new plan, which was drafted by about 60 medical organizations with the Mayo Clinic, the Food and Drug Administration and the medical device company BD in the lead, won't fix these vulnerabilities right away.

Yet it does advise manufacturers to describe to hospitals precisely how they’ll scan for new cyber vulnerabilities in their devices, how they’ll patch them and when. Manufacturers should also tell hospitals how long they’ll support devices by patching newfound vulnerabilities and when hospitals should plan for those devices to reach the end of their usable lives, according to the plan.

It comes one month after the coordinating council and the Department of Health and Human Services released a separate guide, basically outlining hospitals’ cybersecurity responsibilities, including what they should expect from device manufacturers.  

“This begins to resolve the tension between medical device makers and hospitals,” Garcia said, “because device makers have not been building security in over the past several years and, meanwhile, hospitals have not been doing enough to secure their broader networks.”

There are four big reasons cybersecurity is lagging in the health-care sector, Garcia told me.

First off, regulations including the Health Insurance Portability and Accountability Act, a major privacy law, put strict limits around third-party organizations accessing patient data. That makes it difficult for device manufacturers to reach into hospital systems that hold that data to patch and update their software with new protections.

But, second, hospitals are often underequipped to patch the devices themselves, because they lack ready cash and work with far tighter profit margins than banks or major telecommunications companies. That means many smaller hospitals can’t afford chief information security officers — let alone full cybersecurity teams.

Third, many medical devices such as MRI machines are built to last a decade or longer, which means that even if they're built with cybersecurity in mind, they’ll be facing a whole new generation of hacking threats at the end of their life cycles.

Finally, criminal hackers began targeting health care later than other sectors, such as financial services, where stolen information could be converted more quickly into cash. When they did arrive, though, they came in force.

“Quite frankly, it caught a lot of the health-care sector flat-footed” when that changed about seven years ago, Garcia told me. “It was a bit of a slow-motion ambush.”

PINGED, PATCHED, PWNED

PINGED: Security researchers discovered an iPhone bug that allows users to call someone using FaceTime and listen in on their phone’s audio before the person has accepted or rejected the FaceTime call, 9to5Mac reported. And now Apple has turned off group chats on FaceTime, according to the Associated Press. "Apple’s online support page on Tuesday said there was a technical issue with the application and that Group Facetime 'is temporarily unavailable,'" the AP reported.

News of the bug quickly went viral on Twitter. 

PATCHED: Sen. Amy Klobuchar (D-Minn.) and other Senate Democrats want to know how much damage the partial government shutdown inflicted on government computer networks. In a letter today to Homeland Security Secretary Kirstjen Nielsen and National Security Agency Director Paul Nakasone, the lawmakers ask about “any suspicious activity” that occurred during the shutdown and what measures the government took to prevent cyberattacks on federal agencies.

Klobuchar is leading the letter, which was shared with The Cybersecurity 202. It was also signed by Sens. Edward J. Markey (D-Mass.), Tom Udall (D-N.M.), Catherine Cortez Masto (D-Nev.) and Cory Booker (D-N.J.). The letter also notes that several Web security certificates used by federal agencies expired during the shutdown and asks DHS to work with other agencies to prevent that from happening in future shutdowns.

"Experts from multiple cybersecurity firms have warned that these lapses in cybersecurity provide an opportunity for adversaries and cybercriminals to carry out attacks against the U.S. government," the letter states, noting reports that Chinese hackers penetrated the Federal Election Commission website during a 2013 government shutdown.

PWNED: The Trump administration on Monday continued its crackdown on Chinese telecommunications giant Huawei, a company that U.S. officials have said could be a platform for Chinese spying. A 13-count indictment in New York charged Huawei, two affiliates and Meng Wanzhou, Huawei's chief financial officer, The Washington Post's Ellen Nakashima and Devlin Barrett reported. FBI Director Christopher A. Wray said that companies like Huawei “pose a dual threat to both our economic and national security, and the magnitude of these charges make clear just how seriously the FBI takes this threat.”

The indictment contains allegations of bank and wire fraud, and the company is also accused of violating U.S. sanctions on Iran and conspiring to obstruct justice, according to my colleagues. Another 10-count indictment in Washington State alleged that Huawei conspired to steal technical details about a phone-testing robot from T-Mobile.

Sen. Mark R. Warner (D-Va.), the Senate Intelligence Committee's vice chairman, praised the administration for the move. “It has been clear for some time that Huawei poses a threat to our national security, and I applaud the Trump Administration for taking steps to finally hold the company accountable,” Warner said in a statement. Sen. Roger Wicker (R-Miss.), the chairman of the Senate Commerce Committee, said in a statement that the “indictments of Huawei officials confirm the risk of China’s involvement in transformational, next generation technology.”

PUBLIC KEY

— The U.S. Court of Appeals for the 4th Circuit today is set to hear an appeal by Sharyl Attkisson, who in a 2015 lawsuit alleged that she was the subject of illegal government surveillance when she was an investigative reporter at CBS News, the Associated Press's Denise Lavoie reported. A federal judge had dismissed the lawsuit. Attkisson says that two computer forensics teams found unauthorized communications on her laptop connected to an IP address belonging to the U.S. Postal Service, "indicating unauthorized surveillance,” according to the AP. Attkisson made her allegations public in 2013 and filed a complaint with the Justice Department's Inspector General. “The FBI and DOJ publicly stated that they had no knowledge of any electronic surveillance of Attkisson or her family,” according to Lavoie.

— After the Federal Communications Commission was asked whether it would investigate the sale by AT&T, T-Mobile and Sprint of their cellphone users' location data to third-party companies, the agency said it is “going where the facts lead us” and added that it would not “comment publicly in the middle of an investigation,” according to a tweet from Motherboard's Joseph Cox.

— More cybersecurity news from the public sector:

An online marketplace that facilitated more than $68 million in fraud and cybercrime has been shut down following an international law enforcement operation, the U.S. Department of Justice announced Monday.
CyberScoop
PRIVATE KEY
Facebook is planning a dedicated effort to fend off interference in the European Union’s parliamentary election campaign this spring, part of a broader effort to defend against political interference.
The Wall Street Journal
Opinions
New technology essentially makes the terrorist group immune from removal.
Rita Katz
THE NEW WILD WEST

— The European Union’s digital security agency said in a report that Iran will probably increase its cyber espionage operations as Tehran’s relations with Western countries deteriorate, Reuters reported. The agency, called the European Union Agency for Network and Information Security, also said that state-sponsored hackers are among the biggest threats to the E.U.’s digital security. “Newly imposed sanctions on Iran are likely to push the country to intensify state-sponsored cyber threat activities in pursuit of its geopolitical and strategic objectives at a regional level,” the agency’s report said, according to Reuters.

— Europol and its partners are targeting thousands of users of webstresser.org, a site for launching distributed denial-of-service attacks that law enforcement took down last year, according to TechCrunch’s Zack Whittaker. “As part of the collective law enforcement effort from the U.K., U.S., and many European partners in Operation Power Off, Europol obtained a list of its 151,000 registered users,” TechCrunch reported.

— More cybersecurity news from abroad:

Perhaps more than any other nation-state, North Korea-linked hackers have shown no limits in what they will target, from a Hollywood entertainment company to a Bangladeshi bank.
CyberScoop
The counter-drone measures come in the wake of the chaos caused at London's Gatwick airport before Christmas.
The Sydney Morning Herald
ZERO DAYBOOK

Coming soon:

  • BSidesPhilly cybersecurity conference in Philadelphia on Friday.
  • B-Sides Tampa cybersecurity conference in Tampa on Saturday.
EASTER EGGS

Amid political turmoil, Venezuelans express desire for a better future:

Trump never caves. Until he does.

#####EOF##### Anne Gearan - The Washington Post

Anne Gearan

Washington, D.C.

White House reporter
Education: Allegheny College, BA in English and History
Anne Gearan is a White House correspondent for The Washington Post, with a focus on foreign policy and national security. She covered the Hillary Clinton campaign and the State Department for The Post before joining the White House beat. She joined the paper in 2012 from the Associated Press, where she served as chief diplomatic correspondent, Pentagon correspondent, White House reporter and national security editor. She also covered the Supreme Court.
Honors & Awards:
  • Associated Press Virginia staffer of the year
Professional Affiliations: Women's Foreign Policy Group, Medill National Security Washington semester mentor
Foreign languages spoken: French
Latest from Anne Gearan

Jens Stoltenberg echoed the president in saying defense spending among alliance members must be fair.

  • Apr 3, 2019

Trump did not invite alliance leaders to Washington to mark the 70th anniversary this week, a relief to members who feared another blowup.

  • Apr 2, 2019

Three weeks before the Israeli election, the president backed a key political priority for Prime Minister Benjamin Netanyahu.

  • Mar 21, 2019

The effort is aimed at amplifying the pro-Brexit message among Britons even though the United States has no say in the matter.

  • Mar 20, 2019

Jair Bolsonaro rode a Trump-like wave of populist anger to topple a left-wing government, and reaps rewards at the White House.

  • Mar 19, 2019

The weekend barrage of presidential tweets came after a difficult week and ahead of what may be another for Trump.

  • Mar 17, 2019

Twelve Senate Republicans joined Democrats to challenge Trump over his move to circumvent Congress.

  • Mar 15, 2019

The annual ritual of the American president hosting the leader of Ireland in celebration of the intertwined history of the two countries dates to 1959, and evokes a political era of patronage and old-school Irish American pols.

  • Mar 14, 2019

The president’s “cost plus 50” formula has struck fear in the hearts of countries that host American troops.

  • Mar 9, 2019

President Trump announced the trip at the start of a Cabinet meeting, at which he also renewed his false claim that Democrats are responsible for “loopholes” that led to his family separation policy at the U.S.-Mexico border.

  • Jun 21, 2018
#####EOF##### Deanna Paul - The Washington Post

Deanna Paul

Washington, D.C.

Reporter covering national and breaking news
Education: Columbia Graduate School of Journalism, MS in Journalism; Fordham Law School, JD; University of Pennsylvania, BS in History
 Deanna Paul covers national and breaking news for The Washington Post. She recently graduated with honors from Columbia University's Graduate School of Journalism. 

Before joining The Post, Paul spent six years as a New York City prosecutor. 
Latest from Deanna Paul

The Timmothy Pitzen case, one of the country's most confounding child disappearances, may soon be solved.

  • Apr 4, 2019

The lawsuit alleges the government’s pre-review process for published work is an unconstitutional “system of censorship.”

  • Apr 2, 2019

With increasing misinformation and malevolent actors, what responsibility do social networks have in the 2020 election cycle?

  • Mar 30, 2019

Prosecutors have defended their decision to drop charges against the ‘Empire’ star.

  • Mar 28, 2019

President says federal agencies will look into the actor's case, which involved charges of filing a false report that were dismissed Tuesday.

  • Mar 28, 2019

The dismissal of the "Empire" actor's case left unanswered questions. Here are some answers.

  • Mar 28, 2019

A roundup of news from across the nation.

  • Mar 27, 2019

Jake Patterson pleaded guilty to abducting 13-year-old Jayme Closs and killing her parents.

  • Mar 27, 2019

Is flatulence a new form of bullying? Yes, says the man battling his employer in court.

  • Mar 26, 2019

A husband and wife entered a trucking business in Bakersfield. Then the husband killed her, four men and himself, police say.

  • Sep 13, 2018
#####EOF##### Yes, Facebook made mistakes in 2016. But we weren’t the only ones. - The Washington Post

Yes, Facebook made mistakes in 2016. But we weren’t the only ones.


Facebook Chief Operating Officer Sheryl Sandberg testifies during a Senate Intelligence Committee hearing on Capitol Hill on Sept. 5. (Drew Angerer/Getty Images)

Alex Stamos is a Hoover fellow and adjunct professor at Stanford University. He served as the chief security officer at Facebook until August.

Yup, Sheryl Sandberg yelled at me.

It was the day after I briefed Facebook’s Board of Directors on an unprecedented and troubling finding on our platform. Combing through billions of accounts, my colleagues and I had discovered a web of fake personae that we could confidently tie to Russia. I told the board the difficult truth: I had no confidence that we’d found out everything the Russians were up to, and it was quite possible that things would get worse before we built the teams and invented the technology necessary to stop it. Sandberg — as reported in this past week’s New York Times investigation — felt blindsided by this. (She later apologized.)

At the time, technology companies were so enamored with the utility of our own products and so focused on sophisticated attacks from U.S. adversaries such as Russia and China that we overlooked less advanced but still effective propaganda operations. After the election, and having provided our detailed findings to the FBI and special counsel Robert S. Mueller III, Facebook stuck to a public-communications strategy of minimization and denial. It was finally jettisoned in early 2018, but the damage to trust has been massive and will take years to repair. To be clear, no one at the company ever told me not to examine Russian activity, nor did anyone attempt to lie about our findings, but Facebook should have responded to these threats much earlier and handled disclosure in a more transparent manner.

Yet Facebook’s shortcomings do not stand alone. The massive U.S. intelligence community failed to provide actionable intelligence on Russia’s ­information-warfare goals and capabilities before the election and offered a dearth of assistance afterward. Technology companies can build tools and teams to look inward on their products, but they will never have true geostrategic insight or ability to penetrate hostile countries. This relationship has greatly improved in 2018, mostly due to the initiative of hard-working intelligence professionals. Our elected officials, however, can claim little credit. Lawmakers’ public grandstanding at investigative hearings stands in stark contrast to their failure to establish facts, effectively oversee the executive branch and provide for the common defense.

We must also remember that in the summer of 2016, every major media outlet rewarded the hackers of the Russian Main Intelligence Directorate (GRU) with thousands of collective stories drawn from the stolen emails of prominent Democrats. The sad truth is that blocking Russian propaganda would have required Facebook to ban stories from the New York Times, the Wall Street Journal and cable news — not to mention this very paper. Since the election of Donald Trump, print and television news organizations have staffed up and provided a critical service to Americans, but they have never adequately grappled with their culpability in empowering Russia’s election interference.

It is time for us to come together to protect our society from future information operations. While it appears Russia and other U.S. adversaries sat out the 2018 midterms, our good fortune is unlikely to extend through a contentious Democratic presidential primary season and raucous 2020 election.

First, Congress needs to codify standards around political advertising. The current rules restricting the use of powerful online advertising platforms have been adopted voluntarily and by only a handful of companies. Congress needs to update Nixon-era laws to require transparency and limit the ability of all players, including legitimate domestic actors, to micro-target tiny segments of the population with divisive political narratives. It would be great to see Facebook, Google and Twitter propose helpful additions to legislation instead of quietly opposing it.

Second, we need to draw a thoughtful line between the responsibilities of government and the large technology companies. The latter group will always need to act in a quasi-governmental manner, making judgments on political speech and operating teams in parallel to the U.S. intelligence community, but we need more clarity on how these companies make decisions and what powers we want to reserve to our duly elected government. Many areas of cybersecurity demand cooperation between government and corporations, and our allies in France and Germany provide models of how competent defensive cybersecurity responsibility can be built in a democracy.

While many of the individual reporters I have spoken to are now warier of manipulation, it is unclear whether the U.S. media would handle the strategic release of stolen emails any differently today. This might be a fundamental vulnerability in the free press, but it would be reassuring to see leading newsrooms publish their standards on how they might cover newsworthy data leaks without amplifying the messages of the United States’ enemies.

Finally, U.S. citizens must adjust to a media environment in which several dozen gatekeepers no longer control what is newsworthy. While the platforms that bring hundreds of new media outlets to your phone need to improve protections against abuse, in a free society we will always be vulnerable to the injection of narratives from the enemies of democracy, both foreign and domestic. The last line of defense will always be citizens who are willing to question what they see and hear, even when it means questioning our own beliefs.


#####EOF##### Banning Chinese phones won’t fix security problems with our electronic supply chain - The Washington Post
PostEverything

Banning Chinese phones won’t fix security problems with our electronic supply chain

The real issue is overall trust.


You can’t buy phones like this one on U.S. military bases anymore. (Mike Nelson/EPA)
Bruce Schneier is a security technologist and a lecturer at the Kennedy School of Government at Harvard University. His new book, "Click Here to Kill Everybody," will be published in September.

Earlier this month, the Pentagon stopped selling phones made by the Chinese companies ZTE and Huawei on military bases because they might be used to spy on their users.

It’s a legitimate fear, and perhaps a prudent action. But it’s just one instance of the much larger issue of securing our supply chains.

All of our computerized systems are deeply international, and we have no choice but to trust the companies and governments that touch those systems. And while we can ban a few specific products, services or companies, no country can isolate itself from potential foreign interference.

In this specific case, the Pentagon is concerned that the Chinese government demanded that ZTE and Huawei add “back doors” to their phones that could be surreptitiously turned on by government spies or cause them to fail during some future political conflict. This tampering is possible because the software in these phones is incredibly complex. It’s relatively easy for programmers to hide these capabilities, and correspondingly difficult to detect them.

This isn’t the first time the United States has taken action against foreign software suspected to contain hidden features that can be used against us. Last December, President Trump signed into law a bill banning software from the Russian company Kaspersky from being used within the U.S. government. In 2012, the focus was on Chinese-made Internet routers. Then, the House Intelligence Committee concluded: “Based on available classified and unclassified information, Huawei and ZTE cannot be trusted to be free of foreign state influence and thus pose a security threat to the United States and to our systems.”

Nor is the United States the only country worried about these threats. In 2014, China reportedly banned anti-virus products from both Kaspersky and the U.S. company Symantec, based on similar fears. In 2017, the Indian government identified 42 smartphone apps that China subverted. Back in 1997, the Israeli company Check Point was dogged by rumors that its government added back doors into its products; other tech companies from that country have been suspected of the same thing. Even al-Qaeda was concerned; 10 years ago, a sympathizer released the encryption software Mujahedeen Secrets, claiming it was free of Western influence and back doors.

If a country doesn’t trust another country, then it can’t trust that country’s computer products.

But this trust isn’t limited to the country where the company is based. We have to trust the country where the software is written — and the countries where all the components are manufactured. In 2016, researchers discovered that many different models of cheap Android phones were sending information back to China. The phones might be American-made, but the software was from China. In 2016, researchers demonstrated an even more devious technique, where a back door could be added at the computer chip level in the factory that made the chips — without the knowledge of, and undetectable by, the engineers who designed the chips in the first place. Pretty much every U.S. technology company manufactures its hardware in countries such as Malaysia, Indonesia, China and Taiwan.

We also have to trust the programmers. Today’s large software programs are written by teams of hundreds of programmers scattered around the globe. Back doors, put there by we-have-no-idea-who, have been discovered in Juniper firewalls and D-Link routers, both of which are U.S. companies. In 2003, someone almost slipped a very clever back door into Linux. Think of how many countries’ citizens are writing software for Apple or Microsoft or Google.

We can go even farther down the rabbit hole. We have to trust the distribution systems for our hardware and software. Documents disclosed by Edward Snowden showed the National Security Agency installing back doors into Cisco routers being shipped to the Syrian telephone company. There are fake apps in the Google Play store that eavesdrop on you.

Russian hackers subverted the update mechanism of a popular brand of Ukrainian accounting software to spread the NotPetya malware.

In 2017, researchers demonstrated that a smartphone can be subverted by installing a malicious replacement screen.

I could go on. Supply-chain security is an incredibly complex problem. U.S.-only design and manufacturing isn’t an option; the tech world is far too internationally interdependent for that. We can’t trust anyone, yet we have no choice but to trust everyone. Our phones, computers, software and cloud systems are touched by citizens of dozens of different countries, any one of whom could subvert them at the demand of their government. And just as Russia is penetrating the U.S. power grid so that it has that capability in the event of hostilities, many countries are almost certainly doing the same thing at the consumer level.

We don’t know whether the risk of Huawei and ZTE equipment is great enough to warrant the ban. We don’t know what classified intelligence the United States has, and what it implies. But we do know that this is just a minor fix for a much larger problem. It’s doubtful that this ban will have any real effect. Members of the military, and everyone else, can still buy the phones. They just can’t buy them on U.S. military bases. And while the U.S. might block the occasional merger or acquisition, or ban the occasional hardware or software product, we’re largely ignoring that larger issue. Solving it borders on somewhere between incredibly expensive and realistically impossible.

Perhaps someday, global norms and international treaties will render this sort of device-level tampering off-limits. But until then, all we can do is hope that this particular arms race doesn’t get too far out of control.

#####EOF##### Missy Ryan - The Washington Post

Missy Ryan

Washington, D.C.

Reporter covering the Pentagon, military issues and national security
Education: Georgetown University, BA in English; Harvard University, master's in public policy
 Missy Ryan writes about the Pentagon, military issues and national security for The Washington Post. She joined The Post in 2014 from Reuters, where she reported on U.S. national security and foreign policy issues. She has reported from Iraq, Egypt, Libya, Lebanon, Yemen, Afghanistan, Pakistan, Mexico, Peru, Argentina and Chile. 
Honors & Awards:
  • New York Press Club award for political reporting, 2012
Professional Affiliations: Council on Foreign Relations
Foreign languages spoken: Spanish, Arabic
Latest from Missy Ryan

Outgoing Centcom commander says the United States must avoid abrupt shifts that would risk allowing insurgents to rebound.

  • Mar 27, 2019

Latest announcement reveals a disconnect between White House depiction and eyewitness reports.

  • Mar 22, 2019

Syrian forces believe suspects are linked to a suicide bombing in January in the city of Manbij.

  • Mar 19, 2019

It wasn't immediately clear whether the Defense Department inspector general would open an investigation into the acting defense secretary's handling of matters related to his former employer.

  • Mar 13, 2019

The change to an executive order removes the requirement for public accounting on CIA drone strikes.

  • Mar 6, 2019

Defense officials are examining a proposal to shift $3.6 billion in military construction funds to fortify the border.

  • Feb 23, 2019

The decision is a partial reversal of the president’s plan to end the U.S. military mission there.

  • Feb 22, 2019

White House announcement comes after European allies said they would withdraw forces unless the U.S. remained.

  • Feb 21, 2019

The phrase 'climate change' appeared in a draft Pentagon report 23 times. The final version used it once.

  • May 10, 2018

Crown prince sought to lure Khashoggi back to Saudi Arabia and detain him, U.S. intercepts show


Jamal Khashoggi with his fiancee, Hatice Cengiz. (Courtesy of Hatice Cengiz)

The crown prince of Saudi Arabia, Mohammed bin Salman, ordered an operation to lure Washington Post columnist Jamal Khashoggi back to Saudi Arabia from his home in Virginia and then detain him, according to U.S. intelligence intercepts of Saudi officials discussing the plan.

The intelligence, described by U.S. officials familiar with it, is another piece of evidence implicating the Saudi regime in Khashoggi’s disappearance last week after he entered the Saudi Consulate in Istanbul. Turkish officials say that a Saudi security team lay in wait for the journalist and killed him.

Khashoggi was a prominent critic of the Saudi government and Mohammed in particular. Several of Khashoggi’s friends said that over the past four months, senior Saudi officials close to the crown prince had called Khashoggi to offer him protection, and even a high-level job working for the government, if he returned to his home country.

Khashoggi, however, was skeptical of the offers. He told one friend that the Saudi government would never make good on its promises not to harm him.

“He said: ‘Are you kidding? I don’t trust them one bit,’ ” said Khaled Saffuri, an Arab American political activist, recounting a conversation he had with Khashoggi in May, moments after Khashoggi had received a call from Saud al-Qahtani, an adviser to the royal court.

The intelligence pointing to a plan to detain Khashoggi in Saudi Arabia has fueled speculation by officials and analysts in multiple countries that what transpired at the consulate was a backup plan to capture Khashoggi that may have gone wrong.

A former U.S. intelligence official — who, like others, spoke on the condition of anonymity to discuss the sensitive matter — noted that the details of the operation, which involved sending two teams totaling 15 men, in two private aircraft arriving and departing Turkey at different times, bore the hallmarks of a “rendition,” in which someone is extralegally removed from one country and deposited for interrogation in another.

But Turkish officials have concluded that whatever the intent of the operation, Khashoggi was killed inside the consulate. Investigators have not found his body, but Turkish officials have released video surveillance footage of Khashoggi entering the consulate on the afternoon of Oct. 2. There is no footage that shows him leaving, they said.

The intelligence about Saudi Arabia’s earlier plans to detain Khashoggi has raised questions about whether the Trump administration should have warned the journalist that he might be in danger.

Intelligence agencies have a “duty to warn” people who might be kidnapped, seriously injured or killed, according to a directive signed in 2015. The obligation applies regardless of whether the person is a U.S. citizen. Khashoggi was a U.S. resident.

“Duty to warn applies if harm is intended toward an individual,” said a former senior intelligence official. But that duty also depends on whether the intelligence clearly indicated Khashoggi was in danger, the former official said.

“Capturing him, which could have been interpreted as arresting him, would not have triggered a duty-to-warn obligation,” the former official said. “If something in the reported intercept indicated that violence was planned, then, yes, he should have been warned.”

The Office of the Director of National Intelligence, which oversees the warning process, declined to comment on whether Khashoggi had been contacted.

Administration officials have not commented on the intelligence reports that showed a Saudi plan to lure Khashoggi.

“Though I cannot comment on intelligence matters, I can say definitively the United States had no advance knowledge of [Khashoggi’s] disappearance,” deputy State Department spokesman Robert Palladino told reporters Wednesday. Asked whether the U.S. government would have had a duty to warn Khashoggi if it possessed information that he was in jeopardy, Palladino declined to answer what he called a “hypothetical question.”

It was not clear to officials with knowledge of the intelligence whether the Saudis discussed harming Khashoggi as part of the plan to detain him in Saudi Arabia.

But the intelligence had been disseminated throughout the U.S. government and was contained in reports that are routinely available to people working on U.S. policy toward Saudi Arabia or related issues, one U.S. official said.

The intelligence poses a political problem for the Trump administration because it implicates Mohammed, who is particularly close to Jared Kushner, President Trump’s son-in-law and senior adviser.

On Wednesday, Kushner and national security adviser John Bolton spoke by phone with the crown prince, but White House officials said the Saudis provided little information.

Trump has grown frustrated, two officials said, after initially reacting slowly to Khashoggi’s disappearance. Earlier this week, he said he had no information about what had happened to the journalist.

White House officials have begun discussing how to force Saudi Arabia to provide answers and what punishment could be meted out if the government there is found responsible.

Lawmakers on Capitol Hill have reacted harshly to the disappearance. On Wednesday, a bipartisan group of senators asked Trump to impose sanctions on anyone found responsible for Khashoggi’s disappearance, including Saudi leaders.

Sen. Lindsey O. Graham (R-S.C.), perhaps the president’s closest ally in the Senate, predicted a “bipartisan tsunami” of action if the Saudis were involved and said that Khashoggi’s death could alter the nature of relations between the two countries.

Kushner’s relationship with Mohammed, known within national security agencies by the initials MBS, has long been the subject of suspicion by some American intelligence officials.

Kushner and Mohammed have had private, one-on-one phone calls that were not always set up through normal channels so the conversations could be memorialized and Kushner could be properly briefed.

For all his criticism of the Saudi regime, Khashoggi was not always opposed to Mohammed’s policies. Khashoggi credited the young leader for what he saw as positive changes, including loosening Saudi cultural restrictions.

Khashoggi often expressed affection for his homeland, even while saying he did not believe it was safe for him. One person in contact with the crown prince, speaking on the condition of anonymity to preserve the relationship, said Khashoggi last year asked him to give a message to Mohammed saying he needed someone like Khashoggi as an adviser.

When he transmitted the message, this person said, the crown prince said that Khashoggi was tied to the Muslim Brotherhood and to Qatar, both Saudi adversaries, and that the arrangement would never happen.

Two other friends of Khashoggi said that at least twice he received cordial phone calls from Qahtani, the adviser to the prince, conveying friendly messages on his behalf.

In one of the calls, in September 2017, Qahtani said that Mohammed had been “very happy” to see Khashoggi post a message praising the kingdom after the government announced it was lifting a driving ban on women, according to one of the friends, who was with Khashoggi at the time. The tone of the call was pleasant, but Khashoggi also told Qahtani he would praise the government when there were “positive developments. When there are bad things, I will speak up.”

He spent the rest of the call advocating on behalf of several recently imprisoned critics of the regime.

A friend also said that Khashoggi told him he had been approached several times by a businessman close to the Saudi ruling family. The businessman, whom Khashoggi did not name, seemed “keen” to see him every time he visited Washington and told Khashoggi that he would work with the Saudi authorities to arrange his return, the friend said.

Kareem Fahim and Loveday Morris in Istanbul and Josh Dawsey, Karoun Demirjian, Karen DeYoung and Carol Morello in Washington contributed to this report.


Suspected militants detained in Syria attack that killed four Americans


Turkish-backed fighters take part in training on March 6 north of the Syrian town of Manbij, where four Americans were killed in a suicide bombing in January. (Bakr Alkasem/AFP/Getty Images)

U.S. officials have questioned suspected Islamic State militants who Syrian forces believe have links to a suicide attack that killed four Americans in January, an American official said Tuesday.

The official, speaking on the condition of anonymity because he was not authorized to comment publicly, said the prisoners were being held by the U.S.-backed Syrian Democratic Forces (SDF), a Kurdish-dominated group that has been the chief on-the-ground American partner in Syria.

The official spoke hours after the SDF announced that it had captured suspects in the attack, which targeted Americans posted to Syria’s northern city of Manbij in order to conduct counterterrorism and intelligence operations. It was the single deadliest incident involving U.S. government personnel in the war against the Islamic State.

“A group of suspects believed to be involved in January 16 Manbij bombing that killed several U.S. and SDF servicemen were captured following technical surveillance by our forces,” SDF press official Mustafa Bali said on Twitter.

The Americans killed in the attack were Jonathan Farmer, a Green Beret; Shannon Kent, a Navy cryptologist; Scott Wirtz, a civilian intelligence officer; and Ghadir Taher, a contractor. At least two SDF personnel and eight civilians were also killed, local officials have said.

The news of the suspects’ detention was first reported by Reuters. The official said as many as five suspects were questioned.

It was not immediately clear whether the Trump administration, if the detainees’ suspected links to the attack bear out, would seek to bring the suspects to the United States for prosecution or further detention. The administration has said it would be open to putting new terrorism suspects in the military prison at Guantanamo Bay, Cuba, but that move could pose legal and logistical challenges.

Officials have also considered using federal courts to try Islamic State suspects believed to be involved in the death of Americans.

The detention of those and other Islamic State suspects takes place as the Trump administration prepares to reduce its troop footprint in Syria. While President Trump has pushed for ending U.S. military involvement in Syria, officials have persuaded him to allow a small residual force to remain.

Liz Sly in Beirut and Ellen Nakashima in Washington contributed to this report.


Transcript of Mark Zuckerberg’s Senate hearing

Facebook chief executive Mark Zuckerberg appeared before the Senate's Commerce and Judiciary committees Tuesday to discuss data privacy and Russian disinformation on his social network. Below is the transcript of the hearing.

SEN. CHARLES E. GRASSLEY (R-IOWA): The Committees on the Judiciary and Commerce, Science and Transportation will come to order. We welcome everyone to today's hearing on Facebook's social media privacy and the use and abuse of data.

GRASSLEY: Although not unprecedented, this is a unique hearing. The issues we will consider range from data privacy and security to consumer protection and the Federal Trade Commission enforcement touching on jurisdictions of these two committees.

We have 44 members between our two committees. That may not seem like a large group by Facebook standards ...

(LAUGHTER)

... but it is significant here for a hearing in the United States Senate. We will do our best to keep things moving efficiently given our circumstances. We will begin with opening statements from the chairmen and ranking members of each committee, starting with Chairman Thune, and then proceed to Mr. Zuckerberg's opening statement.

We will then move on to questioning. Each member will have five minutes to question witnesses.

I'd like to remind the members of both committees that time limits will be and must be strictly enforced given the numbers that we have here today. If you're over your time, Chairman Thune and I will make sure to let you know. There will not be a second round as well. Of course there will be the usual follow-up written questions for the record. Questioning will alternate between majority and minority and between committees. We will proceed in order based on respective committee seniority.

We will anticipate a couple short breaks later in the afternoon.

And so it's my pleasure to recognize the chairman of the Commerce Committee, Chairman Thune, for his opening statement.

SEN. JOHN THUNE (R-S.D.): Thank you, Chairman Grassley.

Today's hearing is extraordinary. It's extraordinary to hold a joint committee hearing. It's even more extraordinary to have a single CEO testify before nearly half of the United States Senate.

But then, Facebook is pretty extraordinary. More than 2 billion people use Facebook every month. 1.4 billion people use it every day; more than the population of any country on Earth except China, and more than four times the population of the United States. It's also more than 1,500 times the population of my home state of South Dakota.

Plus, roughly 45 percent of American adults report getting at least some of their news from Facebook.

In many respects, Facebook's incredible reach is why we're here today. We're here because of what you, Mr. Zuckerberg, have described as a breach of trust.

A quiz app used by approximately 300,000 people led to information about 87 million Facebook users being obtained by the company Cambridge Analytica.

There are plenty of questions about the behavior of Cambridge Analytica and we expect to hold a future hearing on Cambridge and similar firms. But as you've said, this is not likely to be an isolated incident; a fact demonstrated by Facebook's suspension of another firm just this past weekend.

THUNE: You've promised that when Facebook discovers other apps that had access to large amounts of user data, you will ban them and tell those affected. And that's appropriate, but it's unlikely to be enough for the 2 billion Facebook users.

One reason that so many people are worried about this incident is what it says about how Facebook works. The idea that for every person who decided to try an app, information about nearly 300 other people was scraped from your service is, to put it mildly, disturbing.

And the fact that those 87 million people may have technically consented to making their data available doesn't make those people feel any better.

The recent revelation that malicious actors were able to utilize Facebook's default privacy settings to match email addresses and phone numbers found on the so-called Dark Web to public Facebook profiles potentially affecting all Facebook users only adds fuel to the fire.

What binds these two incidents is that they don't appear to be caused by the kind of negligence that allows typical data breaches to happen. Instead they both appear to be the result of people exploiting the very tools that you created to manipulate users' information.

I know Facebook has taken several steps, and intends to take more, to address these issues. Nevertheless, some have warned that the actions Facebook is taking to ensure that third parties do not obtain data from unsuspecting users, while necessary, will actually serve to enhance Facebook's own ability to market such data exclusively.

Most of us understand that whether you are using Facebook or Google or some other online services, we are trading certain information about ourselves for free or low-cost services. But for this model to persist, both sides of the bargain need to know the stakes that are involved. Right now I am not convinced that Facebook's users have the information that they need to make meaningful choices.

In the past, many of my colleagues on both sides of the aisle have been willing to defer to tech companies' efforts to regulate themselves, but this may be changing.

Just last month, in overwhelming bipartisan fashion, Congress voted to make it easier for prosecutors and victims to go after websites that knowingly facilitate sex trafficking. This should be a wake-up call for the tech community.

We want to hear more, without delay, about what Facebook and other companies plan to do to take greater responsibility for what happens on their platforms.

How will you protect users' data? How will you inform users about the changes that you are making? And how do you intend to proactively stop harmful conduct instead of being forced to respond to it months or years later?

Mr. Zuckerberg, in many ways you and the company that you created, the story that you've created represents the American Dream. Many are incredibly inspired by what you've done.

At the same time, you have an obligation, and it's up to you, to ensure that that dream does not become a privacy nightmare for the scores of people who use Facebook.

This hearing is an opportunity to speak to those who believe in Facebook and those who are deeply skeptical about it. We are listening, America is listening and quite possibly the world is listening, too.

GRASSLEY: Thank you.

Now Ranking Member Feinstein.

DIANNE FEINSTEIN (D-CALIF.): Thank you very much, Mr. Chairman.

Chairman Grassley, Chairman Thune, thank you both for holding this hearing.

Mr. Zuckerberg, thank you for being here. You have a real opportunity this afternoon to lead the industry and demonstrate a meaningful commitment to protecting individual privacy.

We have learned over the past few months, and we've learned a great deal that's alarming. We've seen how foreign actors are abusing social media platforms like Facebook to interfere in elections and take millions of Americans' personal information without their knowledge in order to manipulate public opinion and target individual voters.

Specifically, on February the 16th, Special Counsel Mueller issued an indictment against the Russia-based Internet Research Agency and 13 of its employees for interfering (sic) operations targeting the United States.

Through this 37-page indictment, we learned that the IRA ran a coordinated campaign through 470 Facebook accounts and pages. The campaign included ads and false information to create discord and harm Secretary Clinton's campaign, and the content was seen by an estimated 157 million Americans.

A month later, on March 17th, news broke that Cambridge Analytica exploited the personal information of approximately 50 million Facebook users without their knowledge or permission. And, last week, we learned that number was even higher: 87 million Facebook users who had their private information taken without their consent.

Specifically, using a personality quiz he created, Professor Kogan collected the personal information of 300,000 Facebook users, and then collected data on millions of their friends.

It appears the information collected included everything these individuals had on their Facebook pages and, according to some reports, even included private direct messages between users.

Professor Kogan is said to have taken data from over 70 million Americans. It has also been reported that he sold this data to Cambridge Analytica for $800,000 dollars. Cambridge Analytica then took this data and created a psychological warfare tool to influence United States elections.

In fact, the CEO, Alexander Nix, declared that Cambridge Analytica ran all the digital campaign, the television campaign, and its data informed all the strategy for the Trump campaign.

The reporting has also speculated that Cambridge Analytica worked with the Internet Research Agency to help Russia identify which American voters to target, which its — with its propaganda.

I'm concerned that press reports indicate Facebook learned about this breach in 2015, but appears not to have taken significant steps to address it until this year.

So this hearing is important, and I appreciate the conversation we had yesterday. And I believe that Facebook, through your presence here today and the words you're about to tell us, will indicate how strongly your industry will regulate and/or reform the platforms that they control.

FEINSTEIN: I believe this is extraordinarily important. You lead a big company with 27,000 employees, and we very much look forward to your comments.

Thank you, Mr. Chairman.

GRASSLEY: Thank you, Senator Feinstein.

The history and growth of Facebook mirrors that of many of our technological giants. Founded by Mr. Zuckerberg in 2004, Facebook has exploded over the past 14 years. Facebook currently has over 2 billion monthly active users across the world, over 25,000 employees, and offices in 13 U.S. cities and various other countries.

Like their expanding user base, the data collected on Facebook users has also skyrocketed. They have moved on from schools, likes and relationship statuses. Today, Facebook has access to data points ranging from ads that you've clicked on, events you've attended and your location, based upon your mobile device.

It is no secret that Facebook makes money off this data through advertising revenue, although many seem confused by or altogether unaware of this fact. Facebook generates — generated $40 billion in revenue in 2017, with about 98 percent coming from advertising across Facebook and Instagram.

Significant data collection is also occurring at Google, Twitter, Apple, and Amazon. And even — an ever-expanding portfolio of products and services offered by these companies grant endless opportunities to collect increasing amounts of information on their customers.

As we get more free or extremely low-cost services, the trade-off for the American consumer is to provide more personal data. The potential for further growth and innovation based on collection of data is unlimited. However, the potential for abuse is also significant.

While the contours of the Cambridge Analytica situation are still coming to light, there was clearly a breach of consumer trust and a likely improper transfer of data. The Judiciary Committee will hold a separate hearing exploring Cambridge and other data privacy issues.

More importantly, though, these events have ignited a larger discussion on consumers' expectations and the future of data privacy in our society. It has exposed that consumers may not fully understand or appreciate the extent to which their data is collected, protected, transferred, used and misused.

Data has been used in advertising and political campaigns for decades. The amount and type of data obtained, however, has seen a very dramatic change. Campaigns including Presidents Bush, Obama and Trump all use these increasing amounts of data to focus on microtargeting and personalization over numerous social media platforms, and especially Facebook.

In fact, Presidents — Obama's campaign developed an app utilizing the same Facebook feature as Cambridge Analytica to capture the information of not just the app's users, but millions of their friends.

GRASSLEY: The digital director for that campaign for 2012 described the data-scraping app as something that would, quote, “wind up being the most groundbreaking piece of technology developed for this campaign,” end of quote.

So the effectiveness of these social media tactics can be debated. But their use over the past years, across the political spectrum, and their increased significance cannot be ignored. Our policy towards data privacy and security must keep pace with these changes.

Data privacy should be tethered to consumer needs and expectations. Now, at a minimum, consumers must have the transparency necessary to make an informed decision about whether to share their data and how it can be used.

Consumers ought to have clearer information, not opaque policies and complex click-through consent pages. The tech industry has an obligation to respond to widespread and growing concerns over data privacy and security and to restore the public's trust.

The status quo no longer works. Moreover, Congress must determine if and how we need to strengthen privacy standards to ensure transparency and understanding for the billions of consumers who utilize these products.

Senator Nelson.

BILL NELSON (D-FLA.): Thank you, Mr. Chairman. Mr. Zuckerberg, good afternoon.

Let me just cut to the chase. If you and other social media companies do not get your act in order, none of us are going to have any privacy anymore. That's what we're facing.

We're talking about personally identifiable information that, if not kept by the social media — media companies from theft, a value that we have in America, being our personal privacy — we won't have it anymore. It's the advent of technology.

And, of course, all of us are part of it. From the moment that we wake up in the morning, until we go to bed, we're on those handheld tablets. And online companies like Facebook are tracking our activities and collecting information.

Facebook has a responsibility to protect this personal information. We had a good discussion yesterday. We went over all of this. You told me that the company had failed to do so.

It's not the first time that Facebook has mishandled its users' information. The FTC found that Facebook's privacy policies had deceived users in the past. And, in the present case, we recognize that Cambridge Analytica and an app developer lied to consumers and lied to you, lied to Facebook.

But did Facebook watch over the operations? We want to know that. And why didn't Facebook notify 87 million users that their personally identifiable information had been taken, and it was being also used — why were they not informed — for unauthorized political purposes?

NELSON: So, only now — and I appreciate our conversation — only now, Facebook has pledged to inform those consumers whose accounts were compromised.

I think you are genuine. I got that sense in conversing with you. You want to do the right thing. You want to enact reforms. We want to know if it's going to be enough. And I hope that will be in the answers today.

Now, since we still don't know what Cambridge Analytica has done with this data, you heard Chairman Thune say, as we have discussed, we want to haul Cambridge Analytica in to answer these questions at a separate hearing.

I want to thank Chairman Thune for working with all of us on scheduling a hearing. There's obviously a great deal of interest in this subject. I hope we can get to the bottom of this. And, if Facebook and other online companies will not or cannot fix the privacy invasions, then we are going to have to — we, the Congress.

How can American consumers trust folks like your company to be caretakers of their most personal and identifiable information? And that's the question.

Thank you.

GRASSLEY: Thank you, my colleagues and Senator Nelson.

Our witness today is Mark Zuckerberg, founder, chairman, chief executive officer of Facebook. Mr. Zuckerberg launched Facebook February 4th, 2004, at the age of 19. And, at that time, he was a student at Harvard University.

As I mentioned previously, his company now has over $40 billion of annual revenue and over 2 billion monthly active users. Mr. Zuckerberg, along with his wife, also established the Chan Zuckerberg Initiative to further philanthropic causes.

I now turn to you. Welcome to the committee, and, whatever your statement is orally — if you have a longer one, it'll be included in the record. So, proceed, sir.

MARK ZUCKERBERG: Chairman Grassley, Chairman Thune, Ranking Member Feinstein, Ranking Member Nelson and members of the committee, we face a number of important issues around privacy, safety and democracy. And you will rightfully have some hard questions for me to answer. Before I talk about the steps we're taking to address them, I want to talk about how we got here.

Facebook is an idealistic and optimistic company. For most of our existence, we focused on all of the good that connecting people can do. And, as Facebook has grown, people everywhere have gotten a powerful new tool for staying connected to the people they love, for making their voices heard and for building communities and businesses.

Just recently, we've seen the “Me Too” movement and the March for our Lives organized, at least in part, on Facebook. After Hurricane Harvey, people came together to raise more than $20 million for relief. And more than 70 million businesses — small business use Facebook to create jobs and grow.

But it's clear now that we didn't do enough to prevent these tools from being used for harm, as well. And that goes for fake news, for foreign interference in elections, and hate speech, as well as developers and data privacy.

ZUCKERBERG: We didn't take a broad enough view of our responsibility, and that was a big mistake. And it was my mistake. And I'm sorry. I started Facebook, I run it, and I'm responsible for what happens here.

So, now, we have to go through our — all of our relationship with people and make sure that we're taking a broad enough view of our responsibility.

It's not enough to just connect people. We have to make sure that those connections are positive. It's not enough to just give people a voice. We need to make sure that people aren't using it to harm other people or to spread misinformation. And it's not enough to just give people control over their information. We need to make sure that the developers they share it with protect their information, too.

Across the board, we have a responsibility to not just build tools, but to make sure that they're used for good. It will take some time to work through all the changes we need to make across the company, but I'm committed to getting this right. This includes the basic responsibility of protecting people's information, which we failed to do with Cambridge Analytica.

So here are a few things that we are doing to address this and to prevent it from happening again.

First, we're getting to the bottom of exactly what Cambridge Analytica did, and telling everyone affected. What we know now is that Cambridge Analytica improperly accessed some information about millions of Facebook members by buying it from an app developer.

That information — this was information that people generally share publicly on their Facebook pages, like names and their profile picture and the pages they follow.

When we first contacted Cambridge Analytica, they told us that they had deleted the data. About a month ago, we heard new reports that suggested that wasn't true. And, now, we're working with governments in the U.S., the U.K. and around the world to do a full audit of what they've done and to make sure they get rid of any data they may still have.

Second, to make sure no other app developers out there are misusing data, we're now investigating every single app that had access to a large amount of information in the past. And, if we find that someone improperly used data, we're going to ban them from Facebook and tell everyone affected.

Third, to prevent this from ever happening again, going forward, we're making sure that developers can't access as much information now. The good news here is that we already made big changes to our platform in 2014 that would have prevented this specific situation with Cambridge Analytica from occurring again today.

But there's more to do, and you can find more details on the steps we're taking in my written statement.

My top priority has always been our social mission of connecting people, building community and bringing the world closer together. Advertisers and developers will never take priority over that, as long as I am running Facebook.

I started Facebook when I was in college. We've come a long way since then. We now serve more than 2 billion people around the world. And, every day, people use our services to stay connected with the people that matter to them most.

I believe deeply in what we are doing. And I know that, when we address these challenges we'll look back and view helping people connect and giving more people a voice as a positive force in the world.

I realize the issues we're talking about today aren't just issues for Facebook and our community. They're issues and challenges for all of us as Americans.

Thank you for having me here today, and I'm ready to take your questions.

GRASSLEY: I'll remind members that, maybe, weren't here when I had my opening comments that we are operating under the five-year — the five-minute rule. And that applies to ...

(LAUGHTER)

... the five-minute rule. And that applies to those of us who are chairing the committee, as well.

GRASSLEY: I'll start with you.

Facebook handles extensive amounts of personal data for billions of users. A significant amount of that data is shared with third-party developers who utilize your platform.

As of this — early this year, you did not actively monitor whether that data was transferred by such developers to other parties. Moreover, your policies only prohibit transfers by developers to parties seeking to profit from such data.

Number one, besides Professor Kogan's transfer and now, potentially, Cubeyou, do you know of any instances where user data was improperly transferred to a third party in breach of Facebook's terms? If so, how many times has that happened, and was Facebook only made aware of that transfer by some third party?

ZUCKERBERG: Mr. Chairman, thank you.

As I mentioned, we're now conducting a full investigation into every single app that had a — access to a large amount of information, before we locked down the platform to prevent developers from accessing this information around 2014.

We believe that we're going to be investigating many apps, tens of thousands of apps. And, if we find any suspicious activity, we're going to conduct a full audit of those apps to understand how they're using their data and if they're doing anything improper. If we find that they're doing anything improper, we'll ban them from Facebook and we will tell everyone affected.

As for past activity, I don't have all the examples of apps that we've banned here, but if you would like, I can have my team follow up with you after this.

GRASSLEY: Okay.

Have you ever required an audit to ensure the deletion of improperly transferred data? And, if so, how many times?

ZUCKERBERG: Mr. Chairman, yes we have. I don't have the exact figure on how many times we have. But, overall, the way we've enforced our platform policies in the past is we have looked at patterns of how apps have used our APIs and accessed information, as well as looked into reports that people have made to us about apps that might be doing sketchy things.

Going forward, we're going to take a more proactive position on this and do much more regular spot checks and other reviews of apps, as well as increasing the amount of audits that we do. And, again, I can make sure that our team follows up with you on anything about the specific past stats that would be interesting.

GRASSLEY: I was going to assume that, sitting here today, you have no idea — and if I'm wrong on that, that you're able — you were telling me, I think, that you're able to supply those figures to us, at least as of this point.

ZUCKERBERG: Mr. Chairman, I will have my team follow up with you on what information we have.

GRASSLEY: Okay but, right now, you have no certainty of whether or not — how much of that's going on, right? Okay.

Facebook collects massive amounts of data from consumers, including content, networks, contact lists, device information, location, and information from third parties, yet your data policy is only a few pages long and provides consumers with only a few examples of what is collected and how it might be used.

The examples given emphasize benign uses, such as “connecting with friends,” but your policy does not give any indication of more controversial uses of such data.

My question: Why doesn't Facebook disclose to its users all the ways that data might be used by Facebook and other third parties? And what is Facebook's responsibility to inform users about that information?

ZUCKERBERG: Mr. Chairman, I believe it's important to tell people exactly how the information that they share on Facebook is going to be used. That's why, every single time you go to share something on Facebook, whether it's a photo in Facebook, or a message — in Messenger or WhatsApp, every single time, there's a control right there about who you're going to be sharing it with — whether it's your friends or public or a specific group — and you can — you can change that and control that in line.

To your broader point about the privacy policy, this gets into an — an issue that I — I think we and others in the tech industry have found challenging, which is that long privacy policies are very confusing. And if you make it long and spell out all the detail, then you're probably going to reduce the percent of people who read it and make it accessible to them.

So, one of the things that — that we've struggled with over time is to make something that is as simple as possible so people can understand it, as well as giving them controls in line in the product in the context of when they're trying to actually use them, taking into account that we don't expect that most people will want to go through and read a full legal document.

GRASSLEY: Senator Nelson?

NELSON: Thank you, Mr. Chairman.

Yesterday when we talked, I gave the relatively harmless example that I'm communicating with my friends on Facebook and indicate that I love a certain kind of chocolate. And all of a sudden I start receiving advertisements for chocolate. What if I don't want to receive those commercial advertisements?

So your chief operating officer, Ms. Sandberg, suggested on the NBC “Today Show” that Facebook users who do not want their personal information used for advertising might have to pay for that protection. Pay for it.

Are you actually considering having Facebook users pay for you not to use the information?

ZUCKERBERG: Senator, people have a control over how their information is used in ads in the product today. So if you want to have an experience where your ads aren't — aren't targeted using all the information that we have available, you can turn off third-party information.

What we found is that even though some people don't like ads, people really don't like ads that aren't relevant. And while there is some discomfort for sure with using information in making ads more relevant, the overwhelming feedback that we get from our community is that people would rather have us show relevant content there than not.

So we offer this control that — that you're referencing. Some people use it. It's not the majority of people on Facebook. And — and I think that that's — that's a good level of control to offer.

I think what Sheryl was saying was that, in order to not run ads at all, we would still need some sort of business model.

NELSON: And that is your business model. So I take it that — and I used the harmless example of chocolate. But if it got into more personal thing, communicating with friends, and I want to cut it off, I'm going to have to pay you in order not to send me, using my personal information, something that I don't want. That in essence is what I understood Ms. Sandberg to say. Is that correct?

ZUCKERBERG: Yes, senator.

Although to be clear, we don't offer an option today for people to pay to not show ads. We think offering an ad-supported service is the most aligned with our mission of trying to help connect everyone in the world, because we want to offer a free service that everyone can afford.

NELSON: Okay.

ZUCKERBERG: That's the only way that we can reach billions of people.

NELSON: But — so, therefore, you consider my personally identifiable data the company's data, not my data. Is that it?

ZUCKERBERG: No, senator. Actually, at — the first line of our Terms of Service say that you control and own the information and content that you put on Facebook.

NELSON: Well, the recent scandal is obviously frustrating, not only because it affected 87 million, but because it seems to be part of a pattern of lax data practices by the company, going back years.

So, back in 2011, it was a settlement with the FTC. And, now, we discover yet another incident where the data failed to be protected. When you discovered that Cambridge Analytica — that had fraudulently obtained all of this information, why didn't you inform those 87 million?

ZUCKERBERG: When we learned in 2015 that Cambridge Analytica had bought data from an app developer on Facebook that people had shared it with, we did take action.

We took down the app, and we demanded that both the app developer and Cambridge Analytica delete and stop using any data that they had. They told us that they did this. In retrospect, it was clearly a mistake to believe them ...

NELSON: Yes.

ZUCKERBERG: ... and we should have followed up and done a full audit then. And that is not a mistake that we will make.

NELSON: Yes, you did that, and you apologized for it. But you didn't notify them. And do you think that you have an ethical obligation to notify 87 million Facebook users?

ZUCKERBERG: Senator, when we heard back from Cambridge Analytica that they had told us that they weren't using the data and had deleted it, we considered it a closed case. In retrospect, that was clearly a mistake.

We shouldn't have taken their word for it, and we've updated our policies and how we're going to operate the company to make sure that we don't make that mistake again.

NELSON: Did anybody notify the FTC?

ZUCKERBERG: No, senator, for the same reason — that we'd considered it a closed — a closed case.

GRASSLEY: Senator Thune.

THUNE: And — and, Mr. Zuckerberg, would you that — do that differently today, presumably? That — in response to Senator Nelson's question ...

ZUCKERBERG: Yes.

THUNE: ... having to do it over?

This may be your first appearance before Congress, but it's not the first time that Facebook has faced tough questions about its privacy policies. Wired Magazine recently noted that you have a 14-year history of apologizing for ill-advised decisions regarding user privacy, not unlike the one that you made just now in your opening statement.

After more than a decade of promises to do better, how is today's apology different? And why should we trust Facebook to make the necessary changes to ensure user privacy and give people a clearer picture of your privacy policies?

ZUCKERBERG: Thank you, Mr. Chairman. So we have made a lot of mistakes in running the company. I think it's — it's pretty much impossible, I — I believe, to start a company in your dorm room and then grow it to be at the scale that we're at now without making some mistakes.

And, because our service is about helping people connect and information, those mistakes have been different in — in how they — we try not to make the same mistake multiple times. But in general, a lot of the mistakes are around how people connect to each other, just because of the nature of the service.

ZUCKERBERG: Overall, I would say that we're going through a broader philosophical shift in how we approach our responsibility as a company. For the first 10 or 12 years of the company, I viewed our responsibility as primarily building tools that, if we could put those tools in people's hands, then that would empower people to do good things.

What I think we've learned now across a number of issues — not just data privacy, but also fake news and foreign interference in elections — is that we need to take a more proactive role and a broader view of our responsibility.

It's not enough to just build tools. We need to make sure that they're used for good. And that means that we need to now take a more active view in policing the ecosystem and in watching and kind of looking out and making sure that all of the members in our community are using these tools in a way that's going to be good and healthy.

So, at the end of the day, this is going to be something where people will measure us by our results on this. It's not that I expect anything that I say here today — to necessarily change people's view.

But I'm committed to getting this right. And I believe that, over the coming years, once we fully work all these solutions through, people will see real differences.

THUNE: Well — and I'm glad that you all have gotten that message.

As we discussed in my office yesterday, the line between legitimate political discourse and hate speech can sometimes be hard to identify, and especially when you're relying on artificial intelligence and other technologies for the initial discovery.

Can you discuss what steps that Facebook currently takes when making these evaluations, the challenges that you face and any examples of where you may draw the line between what is and what is not hate speech?

ZUCKERBERG: Yes, Mr. Chairman. I'll speak to hate speech, and then I'll talk about enforcing our content policies more broadly. So — actually, maybe, if — if you're okay with it, I'll go in the other order.

So, from the beginning of the company in 2004 — I started in my dorm room; it was me and my roommate. We didn't have A.I. technology that could look at the content that people were sharing. So — so we basically had to enforce our content policies reactively.

People could share what they wanted, and then, if someone in the community found it to be offensive or against our policies, they'd flag it for us, and we'd look at it reactively. Now, increasingly, we're developing A.I. tools that can identify certain classes of bad activity proactively and flag it for our team at Facebook.

By the end of this year, by the way, we're going to have more than 20,000 people working on security and content review, working across all these things. So, when content gets flagged to us, we have those — those people look at it. And, if it violates our policies, then we take it down.

Some problems lend themselves more easily to A.I. solutions than others. So hate speech is one of the hardest, because determining if something is hate speech is very linguistically nuanced, right?

It's — you need to understand, you know, what is a slur and what — whether something is hateful not just in English, but the majority of people on Facebook use it in languages that are different across the world.

Contrast that, for example, with an area like finding terrorist propaganda, which we've actually been very successful at deploying A.I. tools on already.

Today, as we sit here, 99 percent of the ISIS and Al Qaida content that we take down on Facebook, our A.I. systems flag before any human sees it. So that's a success in terms of rolling out A.I. tools that can proactively police and enforce safety across the community.

Hate speech — I am optimistic that, over a 5 to 10-year period, we will have A.I. tools that can get into some of the nuances — the linguistic nuances of different types of content to be more accurate in flagging things for our systems.

But, today, we're just not there on that. So a lot of this is still reactive. People flag it to us. We have people look at it. We have policies to try to make it as not subjective as possible. But, until we get it more automated, there is a higher error rate than I'm happy with.

THUNE: Thank you ...

(CROSSTALK)

GRASSLEY: Senator Feinstein?

FEINSTEIN: Thanks, Mr. Chairman.

Mr. Zuckerberg, what is Facebook doing to prevent foreign actors from interfering in U.S. elections?

ZUCKERBERG: Thank you, senator.

This is one of my top priorities in 2018 — is to get this right. I — one of my greatest regrets in running the company is that we were slow in identifying the Russian information operations in 2016. We expected them to do a number of more traditional cyber attacks, which we did identify and notify the campaigns that they were trying to hack into them.

But we were slow at identifying the type of — of new information operations.

FEINSTEIN: When did you identify new operations?

ZUCKERBERG: It was right around the time of the 2016 election itself. So, since then, we — 2018 is — is an incredibly important year for elections. Not just in — with the U.S. midterms, but, around the world, there are important elections — in India, in Brazil, in Mexico, in Pakistan and in Hungary, that — we want to make sure that we do everything we can to protect the integrity of those elections.

Now, I have more confidence that we're going to get this right, because, since the 2016 election, there have been several important elections around the world where we've had a better record. There was the French presidential election. There was the German election. There was the U.S. Senate Alabama special election last year.

FEINSTEIN: Explain what is better about the record.

ZUCKERBERG: So we've deployed new A.I. tools that do a better job of identifying fake accounts that may be trying to interfere in elections or spread misinformation. And, between those three elections, we were able to proactively remove tens of thousands of accounts that — before they — they could contribute significant harm.

And the nature of these attacks, though, is that, you know, there are people in Russia whose job it is — is to try to exploit our systems and other Internet systems, and other systems, as well.

So this is an arms race, right? I mean, they're going to keep on getting better at this, and we need to invest in keeping on getting better at this, too, which is why one of things I mentioned before is we're going to have more than 20,000 people, by the end of this year, working on security and content review across the company.

FEINSTEIN: Speak for a moment about automated bots that spread disinformation. What are you doing to punish those who exploit your platform in that regard?

ZUCKERBERG: Well, you're not allowed to have a fake account on Facebook. Your content has to be authentic. So we build technical tools to try to identify when people are creating fake accounts — especially large networks of fake accounts, like the Russians have — in order to remove all of that content.

After the 2016 election, our top priority was protecting the integrity of other elections around the world. But, at the same time, we had a parallel effort to trace back to Russia the IRA activity — the Internet Research Agency activity that was — the part of the Russian government that — that did this activity in — in 2016.

And, just last week, we were able to determine that a number of Russian media organizations that were sanctioned by the Russian regulator were operated and controlled by this Internet Research Agency.

So we took the step last week — that was a pretty big step for us — of taking down sanctioned news organizations in Russia as part of an operation to remove 270 fake accounts and pages, part of their broader network in Russia, that was — that was actually not targeting international interference as much as — sorry, let me correct that.

It was primarily targeting — spreading misinformation in Russia itself, as well as certain Russian-speaking neighboring countries.

FEINSTEIN: How many accounts of this type have you taken down?

ZUCKERBERG: Across — in the IRA specifically, the ones that we've pegged back to the IRA, we can identify the 470 in the American elections and the 270 that we specifically went after in Russia last week.

There were many others that our systems catch, which are more difficult to attribute specifically to Russian intelligence, but the number would be in the tens of thousands of fake accounts that we remove. And I'm happy to have my team follow up with you on more information, if that would be helpful.

FEINSTEIN: Would you, please? I think this is very important.

If you knew in 2015 that Cambridge Analytica was using the information of Professor Kogan's, why didn't Facebook ban Cambridge in 2015? Why'd you wait another ...

(CROSSTALK)

ZUCKERBERG: Senator, that's a — a great question.

Cambridge Analytica wasn't using our services in 2015, as far as we can tell. So this is — this is clearly one of the questions that I asked our team, as soon as I learned about this — is why — why did we wait until we found out about the reports last month to — to ban them.

It's because, as of the time that we learned about their activity in 2015, they weren't an advertiser. They weren't running pages. So we actually had nothing to ban.

FEINSTEIN: Thank you.

Thank you, Mr. Chairman.

GRASSLEY: No, thank you, Senator Feinstein.

Now, Senator Hatch.

SEN. ORRIN G. HATCH (R-UTAH): Well, in my opinion, this is the most — this is the most intense public scrutiny I've seen for a tech-related hearing since the Microsoft hearing that — that I chaired back in the late 1990s.

The recent stories about Cambridge Analytica and data mining on social media have raised serious concerns about consumer privacy, and, naturally, I know you understand that.

At the same time, these stories touch on the very foundation of the Internet economy and the way the websites that drive our Internet economy make money. Some have professed themselves shocked — shocked that companies like Facebook and Google share user data with advertisers.

Did any of these individuals ever stop to ask themselves why Facebook and Google didn't — don't change — don't charge for access? Nothing in life is free. Everything involves trade-offs.

If you want something without having to pay money for it, you're going to have to pay for it in some other way, it seems to me. And that's where — what we're seeing here.

And these great websites that don't charge for access — they extract value in some other way. And there's nothing wrong with that, as long as they're upfront about what they're doing.

To my mind, the issue here is transparency. It's consumer choice. Do users understand what they're agreeing to — to when they access a website or agree to terms of service? Are websites upfront about how they extract value from users, or do they hide the ball?

Do consumers have the information they need to make an informed choice regarding whether or not to visit a particular website? To my — to my mind, these are questions that we should ask or be focusing on.

Now, Mr. Zuckerberg, I remember well your first visit to Capitol Hill, back in 2010. You spoke to the Senate Republican High-Tech Task Force, which I chair. You said back then that Facebook would always be free.

Is that still your objective?

ZUCKERBERG: Senator, yes. There will always be a version of Facebook that is free. It is our mission to try to help connect everyone around the world and to bring the world closer together.

In order to do that, we believe that we need to offer a service that everyone can afford, and we're committed to doing that.

HATCH: Well, if so, how do you sustain a business model in which users don't pay for your service?

ZUCKERBERG: Senator, we run ads.

HATCH: I see. That's great. Whenever a controversy like this arises, there's always the danger that Congress's response will be to step in and overregulate. Now, that's been the experience that I've had, in my 42 years here.

In your view, what sorts of legislative changes would help to solve the problems the Cambridge Analytica story has revealed? And what sorts of legislative changes would not help to solve this issue?

ZUCKERBERG: Senator, I think that there are a few categories of legislation that — that make sense to consider.

Around privacy specifically, there are a few principles that I think it would be useful to — to discuss and potentially codify into law.

One is around having a simple and practical set of — of ways that you explain what you are doing with data. And we talked a little bit earlier around the complexity of laying out these long privacy policies. It's hard to say that people fully understand something when it's only written out in a long legal document. This needs — the stuff needs to be implemented in a way where people can actually understand it, where consumers can — can understand it, but that can also capture all the nuances of how these services work in a way that doesn't — that's not overly restrictive on — on providing the services. That's one.

The second is around giving people complete control. This is the most important principle for Facebook: Every piece of content that you share on Facebook, you own and you have complete control over who sees it and — and how you share it, and you can remove it at any time.

That's why every day, about 100 billion times a day, people come to one of our services and either post a photo or send a message to someone, because they know that they have that control and that who they say it's going to go to is going to be who sees the content.

And I think that that control is something that's important that I think should apply to — to every service.

And the third point is — is just around enabling innovation. Because some of the abuse cases that — that are very sensitive, like face recognition, for example — and I feel there's a balance that's extremely important to strike here, where you obtain special consent for sensitive features like face recognition, but don't — but we still need to make it so that American companies can innovate in those areas, or else we're going to fall behind Chinese competitors and others around the world who have different regimes for — for different new features like that.

GRASSLEY: Senator Cantwell?

SEN. MARIA CANTWELL (D-WASH): Thank you, Mr. Chairman.

Welcome Mr. Zuckerberg.

Do you know who Palantir is?

ZUCKERBERG: I do.

CANTWELL: Some people refer to them as a Stanford Analytica. Do you agree?

ZUCKERBERG: Senator, I have not heard that.

CANTWELL: Okay.

Do you think Palantir taught Cambridge Analytica, as press reports are saying, how to do these tactics?

ZUCKERBERG: Senator, I do not know.

CANTWELL: Do you think that Palantir has ever scraped data from Facebook?

ZUCKERBERG: Senator, I'm not aware of that.

CANTWELL: Do you think that during the 2016 campaign, as Cambridge Analytica was providing support to the Trump campaign under Project Alamo, were there any Facebook people involved in that sharing of technique and information?

ZUCKERBERG: Senator, we provided support to the Trump campaign similar to what we provide to any advertiser or campaign who asks for it.

CANTWELL: So that was a yes. Was that a yes?

ZUCKERBERG: Senator, can you repeat the specific question? I just want to make sure I get specifically what you're asking.

CANTWELL: During the 2016 campaign, Cambridge Analytica worked with the Trump campaign to refine tactics. And were Facebook employees involved in that?

ZUCKERBERG: Senator, I don't know that our employees were involved with Cambridge Analytica. Although I know that we did help out the Trump campaign overall in sales support in the same way that we do with other companies.

CANTWELL: So they may have been involved and all working together during that time period? Maybe that's something your investigation will find out.

ZUCKERBERG: Senator, my — I can certainly have my team get back to you on any specifics there that I don't know, sitting here today.

CANTWELL: Have you heard of Total Information Awareness? Do you know what I'm talking about?

ZUCKERBERG: No, I do not.

CANTWELL: Okay. Total Information Awareness was, 2003, John Ashcroft and others trying to do similar things to what I think is behind all of this — geopolitical forces trying to get data and information to influence a process.

So, when I look at Palantir and what they're doing; and I look at WhatsApp, which is another acquisition; and I look at where you are, from the 2011 consent decree, and where you are today; I am thinking, “Is this guy outfoxing the foxes? Or is he going along with what is a major trend in an information age, to try to harvest information for political forces?”

And so my question to you is, do you see that those applications, that those companies — Palantir and even WhatsApp — are going to fall into the same situation that you've just fallen into, over the last several years?

ZUCKERBERG: Senator, I'm not — I'm not sure, specifically. Overall, I — I do think that these issues around information access are challenging.

To the specifics about those apps, I'm not really that familiar with what Palantir does. WhatsApp collects very little information and, I — I think, is less likely to have the kind of issues because of the way that the service is architected. But, certainly, I think that these are broad issues across the tech industry.

CANTWELL: Well, I guess, given the track record — where Facebook is and why you're here today, I guess people would say that they didn't act boldly enough.

And the fact that people like John Bolton, basically, was an investor — in a New York Times article earlier — I guess it was actually last month — that the Bolton PAC was obsessed with how America was becoming limp-wristed and spineless, and it wanted research and messaging for national security issues.

So the fact that, you know, there are a lot of people who are interested in this larger effort — and what I think my constituents want to know is, was this discussed at your board meetings? And what are the applications and interests that are being discussed without putting real teeth into this?

We don't want to come back to this situation again. I believe you have all the talent. My question is whether you have all the will to help us solve this problem.

ZUCKERBERG: Yes, Senator.

So data privacy and foreign interference in elections are certainly topics that we have discussed at the board meeting. These are some of the biggest issues that the company has faced, and we feel a huge responsibility to get these right.

CANTWELL: Do you believe European regulations should be applied here in the U.S.?

ZUCKERBERG: Senator, I think everyone in the world deserves good privacy protection. And, regardless of whether we implement the exact same regulation, I would guess that it would be somewhat different, because we have somewhat different sensibilities in the U.S. as to other countries.

We're committed to rolling out the controls and the affirmative consent and the special controls around sensitive types of technology, like face recognition, that are required in GDPR. We're doing that around the world.

So I think it's certainly worth discussing whether we should have something similar in the U.S. But what I would like to say today is that we're going to go forward and implement that, regardless of what the regulatory outcome is.
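The opt-in model Zuckerberg describes, in which GDPR-style sensitive features such as face recognition stay disabled until a user gives affirmative consent, can be sketched in a few lines. This is a hypothetical illustration; the class and feature names are invented and do not reflect Facebook's actual code:

```python
class ConsentSettings:
    """Toy sketch of affirmative (opt-in) consent for sensitive features.
    Hypothetical names; not a real Facebook or GDPR API."""

    SENSITIVE = {"face_recognition", "location_history"}

    def __init__(self):
        self.granted = set()

    def opt_in(self, feature):
        # Consent must be an explicit, recorded action by the user.
        self.granted.add(feature)

    def is_enabled(self, feature):
        # Sensitive features default to OFF until the user opts in;
        # ordinary features are available by default.
        if feature in self.SENSITIVE:
            return feature in self.granted
        return True


settings = ConsentSettings()
print(settings.is_enabled("face_recognition"))  # off until affirmative consent
settings.opt_in("face_recognition")
print(settings.is_enabled("face_recognition"))  # now on
```

The design point is the default: for the sensitive set, the absence of a recorded opt-in means the feature is off, rather than on until the user opts out.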

GRASSLEY: Senator Wicker?

Senator Thune will chair next.

Senator Wicker?

SEN. ROGER WICKER (R-MISS): Thank you, Mr. Chairman.

And, Mr. Zuckerberg, thank you for being with us.

My question is going to be, sort of, a follow-up on what Senator Hatch was talking about. And let me agree with basically his — his advice, that we don't want to overregulate (inaudible) to the point where we're stifling innovation and investment.

I understand with regard to suggested rules or suggested legislation, there are at least two schools of thought out there.

One would be the ISPs, the Internet service providers, who are advocating for privacy protections for consumers that apply to all online entities equally across the entire Internet ecosystem.

Now, Facebook is an edge provider on the other hand. It is my understanding that many edge providers, such as Facebook, may not support that effort, because edge providers have different business models than the ISPs and should not be considered like services.

So, do you think we need consistent privacy protections for consumers across the entire Internet ecosystem that are based on the type of consumer information being collected, used or shared, regardless of the entity doing the collecting, reusing or sharing?

ZUCKERBERG: Senator, this is an important question.

I would differentiate between ISPs, which I consider to be the pipes of the Internet, and the platforms like Facebook or Google or Twitter, YouTube that are the apps or platforms on top of that.

I think in general, the expectations that people have of the pipes are somewhat different from the platforms. So there might be areas where there needs to be more regulation in one and less in the other, but I think that there are going to be other places where there needs to be more regulation of the other type.

Specifically, though, on the pipes, one of the important issues that — that I think we face and have debated is ...

WICKER: When you — when you say “pipes,” you mean ...

ZUCKERBERG: ISPs.

WICKER: ... the ISPs.

ZUCKERBERG: Yeah.

So I know net neutrality has been a — a hotly debated topic, and one of the reasons why I have been out there saying that I think that should be the case is because, you know, I look at my own story of when I was getting started building Facebook at Harvard, you know, I only had one option for an ISP to use. And if I had to pay extra in order to make it so that my app could potentially be seen or used by other people, then — then we probably wouldn't be here today.

WICKER: Okay, well — but we're talking about privacy concerns. And let me just say, we'll — we'll have to follow up on this. But I think you and I agree, this is going to be one of the major items of debate if we have to go forward and — and do this from a governmental standpoint.

Let me just move on to another couple of items.

Is it true that — as was recently publicized, that Facebook collects the call and text histories of its users that use Android phones?

ZUCKERBERG: Senator, we have an app called Messenger for sending messages to your Facebook friends. And that app offers people an option to sync their — their text messages into the messaging app, and to make it so that — so basically so you can have one app where it has both your texts and — and your Facebook messages in one place.

We also allow people the option of ...

WICKER: You can opt in or out of that?

ZUCKERBERG: Yes. It is opt-in.

WICKER: It is easy to opt out?

ZUCKERBERG: It is opt-in. You — you have to affirmatively say that you want to sync that information before we get access to it.

WICKER: Unless you — unless you opt in, you don't collect that call and text history?

ZUCKERBERG: That is correct.

WICKER: And is that true for — is this practice done at all with minors, or do you make an exception there for persons aged 13 to 17?

ZUCKERBERG: I do not know. We can follow up with that.

WICKER: Okay, do that — let's do that.

One other thing: There have been reports that Facebook can track a user's Internet browsing activity, even after that user has logged off of the Facebook platform. Can you confirm whether or not this is true?

ZUCKERBERG: Senator — I — I want to make sure I get this accurate, so it would probably be better to have my team follow up afterwards.

WICKER: You don't know?

ZUCKERBERG: I know that the — people use cookies on the Internet, and that you can probably correlate activity between — between sessions.

We do that for a number of reasons, including security, and including measuring ads to make sure that the ad experiences are the most effective, which, of course, people can opt out of. But I want to make sure that I'm precise in my answer, so let me ...

WICKER: When — well, when you get ...

ZUCKERBERG: ... follow up with you on that.

WICKER: ... when you get back to me, sir, would you also let us know how Facebook's — discloses to its users that engaging in this type of tracking gives us that result?

ZUCKERBERG: Yes.

WICKER: And thank you very much.
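The session-correlation mechanism Zuckerberg gestures at in this exchange, a persistent cookie identifier that ties together visits made in separate browsing sessions, including logged-out ones, can be sketched as follows. All names here are hypothetical; this illustrates the general technique, not Facebook's implementation:

```python
import secrets


class CookieStore:
    """Toy model of how a persistent cookie correlates visits across sessions.
    Purely illustrative; names and structure are hypothetical."""

    def __init__(self):
        self.visits = {}  # cookie_id -> list of (site, action)

    def get_or_set_cookie(self, browser):
        # On first contact, the server plants a random identifier in the browser.
        if "tracker_id" not in browser:
            browser["tracker_id"] = secrets.token_hex(8)
        return browser["tracker_id"]

    def record(self, browser, site, action):
        cid = self.get_or_set_cookie(browser)
        self.visits.setdefault(cid, []).append((site, action))

    def history_for(self, browser):
        # Later sessions with the same cookie correlate back to earlier
        # activity, even if the user never logged in during them.
        return self.visits.get(browser.get("tracker_id"), [])


store = CookieStore()
browser = {}  # simulates one browser's cookie jar
store.record(browser, "news.example", "read")
store.record(browser, "shop.example", "view-ad")  # a later, logged-out session
print(store.history_for(browser))  # both visits tie back to one cookie id
```

This same correlation supports the uses Zuckerberg names (security, ad measurement); whether it also constitutes tracking after logout is precisely the question Wicker asks him to answer for the record.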

GRASSLEY: Thank you, Senator Wicker.

Senator Leahy's up next.

SEN. PATRICK J. LEAHY (D-VT): Thank you.

Mr. Zuckerberg, I — I assume Facebook's been served with subpoenas from the — Special Counsel Mueller's office. Is that correct?

ZUCKERBERG: Yes.

LEAHY: Have you or anyone at Facebook been interviewed by the Special Counsel's Office?

ZUCKERBERG: Yes.

LEAHY: Have you been interviewed ...

ZUCKERBERG: I have not. I — I have not.

LEAHY: Others have?

ZUCKERBERG: I — I believe so. And I want to be careful here, because that — our work with the special counsel is confidential, and I want to make sure that, in an open session, I'm not revealing something that's confidential.

LEAHY: I understand. I just want to make clear that you have been contacted, you have had subpoenas.

ZUCKERBERG: Actually, let me clarify that. I actually am not aware of — of a subpoena. I believe that there may be, but I know we're working with them.

LEAHY: Thank you.

Six months ago, your general counsel promised us that you were taking steps to prevent Facebook from serving as what I would call an unwitting co-conspirator in Russian interference.

But these — these unverified, divisive pages are on Facebook today. They look a lot like the anonymous groups that Russian agents used to spread propaganda during the 2016 election.

Are you able to confirm whether they're Russian-created groups? Yes or no?

ZUCKERBERG: Senator, are you asking about those specifically?

LEAHY: Yes.

ZUCKERBERG: Senator, last week, we actually announced a major change to our ads and pages policies: that we will be verifying the identity of every single advertiser ...

LEAHY: I'm asking about specific ones. Do you know whether they are?

ZUCKERBERG: I am not familiar with those pieces of content specifically.

LEAHY: But, if you decided this policy a week ago, you'd be able to verify them?

ZUCKERBERG: We are working on that now. What we're doing is we're going to verify the identity of any advertiser who's running a political or issue-related ad — this is basically what the Honest Ads Act is proposing, and we're following that.

And we're also going to do that for pages. So ...

LEAHY: But you can't answer on these?

ZUCKERBERG: I — I'm not familiar with those specific cases.

LEAHY: Well, will you — will you find out the answer and get back to me?

ZUCKERBERG: I'll have my team get back to you.

I do think it's worth adding, though, that we're going to do the same verification of identity and location of admins who are running large pages.

So, that way, even if they aren't going to be buying ads in our system, that will make it significantly harder for Russian interference efforts or other inauthentic efforts ...

LEAHY: Well, some ...

ZUCKERBERG: ... to try to spread misinformation through the network.

LEAHY: ... it's a fight that's been going on for some time, so I might say it's about time.

You know, six months ago, I asked your general counsel about Facebook's role as a breeding ground for hate speech against Rohingya refugees. Recently, U.N. investigators blamed Facebook for playing a role in inciting possible genocide in Myanmar. And there has been genocide there.

You say you use A.I. to find this. This is the type of content I'm referring to. It calls for the death of a Muslim journalist. Now, that threat went straight through your detection systems, it spread very quickly, and then it took attempt after attempt after attempt, and the involvement of civil society groups, to get you to remove it.

Why couldn't it be removed within 24 hours?

ZUCKERBERG: Senator, what's happening in Myanmar is a terrible tragedy, and we need to do more ...

(CROSSTALK)

LEAHY: We all agree with that.

ZUCKERBERG: Okay.

LEAHY: But U.N. investigators have blamed you — blamed Facebook for playing a role in the genocide. We all agree it's terrible. How can you dedicate, and will you dedicate, resources to make sure such hate speech is taken down within 24 hours?

ZUCKERBERG: Yes. We're working on this. And there are three specific things that we're doing.

One is we're hiring dozens of more Burmese-language content reviewers, because hate speech is very language-specific. It's hard to do it without people who speak the local language, and we need to ramp up our effort there dramatically.

Second is we're working with civil society in Myanmar to identify specific hate figures so we can take down their accounts, rather than specific pieces of content.

And third is we're standing up a product team to do specific product changes in Myanmar and other countries that may have similar issues in the future to prevent this from happening.

LEAHY: Senator Cruz and I sent a letter to Apple, asking what they're going to do about Chinese censorship. My question, I'll place ...

THUNE: That'd be great. Thank you, Senator Leahy.

LEAHY: ... I'll place for the record — I want to know what you will do about Chinese censorship, when they come to you.

THUNE: Senator Graham's up next.

SEN. LINDSEY O. GRAHAM (R-S.C.): Thank you.

Are you familiar with Andrew Bosworth?

ZUCKERBERG: Yes, senator, I am.

GRAHAM: He said, “So we connect more people. Maybe someone dies in a terrorist attack coordinated on our tools. The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people, more often, is de facto good.” Do you agree with that?

ZUCKERBERG: No, senator, I do not. And, as context, Boz wrote that — Boz is what we call him internally — he wrote that as an internal note. We have a lot of discussion internally. I disagreed with it at the time that he wrote it. If you looked at the comments on the internal discussion ...

GRAHAM: Would you say ...

ZUCKERBERG: ... the vast majority of people internally did, too.

GRAHAM: ... that you did a poor job, as a CEO, communicating your displeasure with such thoughts? Because, if he had understood where you — where you were at, he would have never said it to begin with.

ZUCKERBERG: Well, senator, we try to run our company in a way where people can express different opinions internally.

GRAHAM: Well, this is an opinion that really disturbs me. And, if somebody worked for me that said this, I'd fire them.

Who's your biggest competitor?

ZUCKERBERG: Senator, we have a lot of competitors.

GRAHAM: Who's your biggest?

ZUCKERBERG: I think the categories of — did you want just one? I'm not sure I can give one, but can I give a bunch?

GRAHAM: Yes.

ZUCKERBERG: So there are three categories that I would focus on. One are the other tech platforms — so Google, Apple, Amazon, Microsoft — we overlap with them in different ways.

GRAHAM: Do they do — do they provide the same service you provide?

ZUCKERBERG: In different ways — different parts of it, yes.

GRAHAM: Let me put it this way. If I buy a Ford, and it doesn't work well, and I don't like it, I can buy a Chevy. If I'm upset with Facebook, what's the equivalent product that I can go sign up for?

ZUCKERBERG: Well, there — the second category that I was going to talk about are ...

(CROSSTALK)

GRAHAM: I'm not talking about categories. I'm talking about, is there real competition you face? Because car companies face a lot of competition. If they make a defective car, it gets out in the world, people stop buying that car; they buy another one.

Is there an alternative to Facebook in the private sector?

ZUCKERBERG: Yes, Senator. The average American uses eight different apps to communicate with their friends and stay in touch with people ...

(CROSSTALK)

GRAHAM: Okay. Which is ...

ZUCKERBERG: ... ranging from texting apps, to email, to ...

GRAHAM: ... is the same service you provide?

ZUCKERBERG: Well, we provide a number of different services.

GRAHAM: Is Twitter the same as what you do?

ZUCKERBERG: It overlaps with a portion of what we do.

GRAHAM: You don't think you have a monopoly?

ZUCKERBERG: It certainly doesn't feel like that to me.

GRAHAM: Okay.

(LAUGHTER)

So it doesn't. So, Instagram — you bought Instagram. Why did you buy Instagram?

ZUCKERBERG: Because they were very talented app developers who were making good use of our platform and understood our values.

GRAHAM: It is a good business decision. My point is that one way to regulate a company is through competition, through government regulation. Here's the question that all of us got to answer: What do we tell our constituents, given what's happened here, why we should let you self-regulate?

What would you tell people in South Carolina, that given all of the things we've just discovered here, it's a good idea for us to rely upon you to regulate your own business practices?

ZUCKERBERG: Well, senator, my position is not that there should be no regulation.

GRAHAM: Okay.

ZUCKERBERG: I think the Internet is increasingly ...

(CROSSTALK)

GRAHAM: You embrace regulation?

ZUCKERBERG: I think the real question, as the Internet becomes more important in people's lives, is what is the right regulation, not whether there should be or not.

GRAHAM: But — but you, as a company, welcome regulation?

ZUCKERBERG: I think, if it's the right regulation, then yes.

GRAHAM: You think the Europeans had it right?

ZUCKERBERG: I think that they get things right.

GRAHAM: Have you ever submitted ...

(LAUGHTER)

That's true. So would you work with us in terms of what regulations you think are necessary in your industry?

ZUCKERBERG: Absolutely.

GRAHAM: Okay. Would you submit to us some proposed regulations?

ZUCKERBERG: Yes. And I'll have my team follow up with you so, that way, we can have this discussion across the different categories where I think that this discussion needs to happen.

GRAHAM: Look forward to it.

When you sign up for Facebook, you sign up for a terms of service. Are you familiar with that?

ZUCKERBERG: Yes.

GRAHAM: Okay. It says, “The terms govern your use of Facebook and the products, features, apps, services, technologies, software we offer — Facebook's products or products — except where we expressly state that separate terms, and not these, apply.”

I'm a lawyer. I have no idea what that means. But, when you look at terms of service, this is what you get. Do you think the average consumer understands what they're signing up for?

ZUCKERBERG: I don't think that the average person likely reads that whole document.

GRAHAM: Yeah.

ZUCKERBERG: But I think that there are different ways that we can communicate that, and have a responsibility to do so.

GRAHAM: Do you — do you agree with me that you better come up with different ways, because this ain't working?

ZUCKERBERG: Well, senator, I think, in certain areas, that is true. And I think, in other areas, like the core part of what we do — right, if you — if you think about — just, at the most basic level, people come to Facebook, Instagram, WhatsApp, Messenger, about a hundred billion times a day to share a piece of content or a message with a specific set of people.

And I think that that basic functionality people understand, because we have the controls in line every time, and given the volume of — of — of the activity, and the value that people tell us that they're getting from that, I think that that control in line does seem to be working fairly well.

Now we can always do better, and there are other — the services are complex, and there is more to it than just — you know, you go and you post a photo, so I — I — I agree that — that in many places we could do better.

But I think, for the core of the service, it actually is quite clear.

GRASSLEY: Thank you, Senator Graham.

Senator Klobuchar.

SEN. AMY KLOBUCHAR (D-MINN): Thank you, Mr. Chairman. Mr. Zuckerberg, I think we all agree that what happened here was bad. You acknowledged it was a breach of trust. And the way I explain it to my constituents is that if someone breaks into my apartment with a crowbar and they take my stuff, it's just like if the manager gave them the keys or if they didn't have any locks on the doors: it's still a breach; it's still a break-in. And I believe we need to have laws and rules that are as sophisticated as the — the brilliant products that you've developed here. And we just haven't done that yet.

And one of the areas that I've focused on is the election. And I appreciate the support that you and Facebook, and now Twitter, actually, have given to the Honest Ads Act bill that you mentioned, that I'm leading with Senator McCain and Senator Warner.

And I just want to be clear, as we work to pass this law so that we have the same rules in place to disclose political ads and issue ads as we do for TV and radio, as well as disclaimers, that you're going to take early action, as soon as June I heard, before this election so that people can view these ads, including issue ads. Is that correct?

ZUCKERBERG: That is correct, senator. And I just want to take a moment before I go into this in more detail to thank you for your leadership on this. This, I think, is an important area for the whole industry to move on.

The two specific things that we're doing are — one is around transparency, so now you're going to be able to go and click on any advertiser or any page on Facebook and see all of the ads that they're running. So that actually brings advertising online — on Facebook to an even higher standard than what you would have on TV or print media, because there's nowhere where you can see all of the TV ads that someone is running, for example. Whereas you will be able to see now on Facebook whether this campaign or third party is saying different messages to different types of people, and I think that that's a really important element of transparency.

But the other really important piece is around verifying every single advertiser who's going to be running political or issue ads.
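The transparency feature Zuckerberg describes, where anyone can look up every ad an advertiser is running, amounts to a public archive keyed by advertiser. A minimal sketch, with invented names:

```python
from collections import defaultdict


class AdArchive:
    """Minimal sketch of the transparency feature described: every ad an
    advertiser runs is listed publicly. Hypothetical structure and names."""

    def __init__(self):
        self.ads = defaultdict(list)  # advertiser -> [(audience, message)]

    def run_ad(self, advertiser, audience, message):
        self.ads[advertiser].append((audience, message))

    def all_ads(self, advertiser):
        # Anyone can inspect the full list, not just the ads targeted at them.
        return list(self.ads[advertiser])


archive = AdArchive()
archive.run_ad("campaign_x", "age_18_25", "Message A")
archive.run_ad("campaign_x", "age_55_plus", "Message B")
print(archive.all_ads("campaign_x"))
```

Because the lookup returns all of an advertiser's ads rather than only those targeted at the viewer, differing messages sent to different audiences become visible, which is the contrast with TV and print that Zuckerberg draws.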

KLOBUCHAR: I appreciate that. And Senator Warner and I have also called on Google and the other platforms to do the same. So memo to the rest of you, we have to get this done or we're going to have a patchwork of ads, and I hope that you'll be working with us to pass this bill. Is that right?

ZUCKERBERG: We will.

KLOBUCHAR: Okay, thank you.

Now on the subject of Cambridge Analytica, were these people, the 87 million people, users, concentrated in certain states? Are you able to figure out where they're from?

ZUCKERBERG: I do not have that information with me, but we can follow up with your — your office.

KLOBUCHAR: Okay, because as we know, that election was close, and it was only thousands of votes in certain states. You've also estimated that roughly 126 people — million people may have been shown content from a Facebook page associated with the Internet Research Agency.

Have you determined when — whether any of those people were the same Facebook users whose data was shared with Cambridge Analytica? Are you able to make that determination?

ZUCKERBERG: Senator, we're investigating that now. We believe that it is entirely possible that there will be a connection there.

KLOBUCHAR: Okay, that seems like a big deal as we look back at that last election. Former Cambridge Analytica employee Christopher Wiley has said that the data that it improperly obtained — that Cambridge Analytica improperly obtained from Facebook users could be stored in Russia.

Do you agree that that's a possibility?

ZUCKERBERG: Sorry, are you — are you asking if Cambridge Analytica's data — data could be stored in Russia?

KLOBUCHAR: That's what he said this weekend on a Sunday show.

ZUCKERBERG: Senator, I don't have any specific knowledge that would suggest that.

But one of the steps that we need to take now is go do a full audit of all of Cambridge Analytica's systems to understand what they're doing, whether they still have any data, to make sure that they remove all the data. If they don't, we're going to take legal action against them to do so.

That audit, we have temporarily ceded that in order to let the U.K. government complete their government investigation first, because, of course, a government investigation takes precedence against a company doing that. But we are committed to completing this full audit and getting to the bottom of what's going on here, so that way we can have more answers to this.

KLOBUCHAR: Okay.

You earlier stated publicly and here that you would support some privacy rules so that everyone's playing by the same rules here. And you also said here that you should have notified customers earlier.

Would you support a rule that would require you to notify your users of a breach within 72 hours?

ZUCKERBERG: Senator, that makes sense to me. And I think we should have our team follow up with — with yours to — to discuss the details around that more.

KLOBUCHAR: Thank you.

I just think part of this was when people don't even know that their data's been breached, that's a huge problem. And I also think we get to solutions faster when we get that information out there.

Thank you. And we look forward to passing this bill — we'd love to pass it before the election — on the honest ads. And I'm looking forward to better disclosure this election.

Thank you.

THUNE: Thank you, Senator Klobuchar.

Senator Blunt's up next.

SEN. ROY BLUNT (R-MO): Thank you, Mr. Chairman.

Mr. Zuckerberg, nice to see you.

When I saw you not too long after I entered the Senate in 2011, I told you, when I sent my business cards down to be printed, they came back from the Senate print shop with the message that it was the first business card they'd ever printed a Facebook address on.

There are days when I've regretted that, but more days when we get lots of information that we need to get. There are days when I wonder if “Facebook friends” is a little misstated. It doesn't seem like I have those every single day.

But, you know, the — the platform you've created is really important. And my son Charlie, who's 13, is dedicated to Instagram. So he'd want to be sure I mentioned him while I was here with — with you.

I haven't printed that on my card yet, I — I will — will say that, but I think we have that account as well. Lots of ways to connect people.

And the — the information, obviously, is an important commodity and it's what makes your business work. I get that.

However, I wonder about some of the collection efforts. And maybe we can go through largely just even “yes” and “no” and then we'll get back to more expansive discussion of this.

But do you collect user data through cross-device tracking?

ZUCKERBERG: Senator, I believe we do link people's accounts between devices in order to make sure that their Facebook and Instagram and their other experiences can be synced between their devices.

BLUNT: And that would also include offline data, data that's tracking that's not necessarily linked to Facebook, but linked to one — some device they went through Facebook on, is that right?

ZUCKERBERG: Senator, I want to make sure we get this right. So I want to have my team follow up with you on that afterwards.

BLUNT: Well, now, that doesn't seem that complicated to me. Now, you — you understand this better than I do, but maybe — maybe you can explain to me why that's that — why that's complicated.

Do you track devices that an individual who uses Facebook has that is connected to the device that they use for their Facebook connection, but not necessarily connected to Facebook?

ZUCKERBERG: I'm not — I'm not sure of the answer to that question.

BLUNT: Really?

ZUCKERBERG: Yes. There — there may be some data that is necessary to provide the service that we do. But I don't — I don't have that on — sitting here today. So that's something that I would want to follow up on.

BLUNT: Now, the FTC, last year, flagged cross-device tracking as one of their concerns — generally, that people are tracking devices that the users of something like Facebook don't know they're being tracked.

How do you disclose your collected — collection methods? Is that all in this document that I would see and agree to before I entered into Facebook?

ZUCKERBERG: Yes, senator. So there are — there are two ways that we do this. One is we try to be exhaustive in the legal documents, or on the terms of service and privacy policies. But, more importantly, we try to provide in-line controls so that — that are in plain English, that people can understand.

They can either go to settings, or we can show them at the top of the app, periodically, so that people understand all the controls and settings they have and can — can configure their experience the way that they want.

BLUNT: So do people — do people now give you permission to track specific devices in their contract? And, if they do, is that a relatively new addition to what you do?

ZUCKERBERG: Senator, I'm sorry. I don't have that.

BLUNT: Am I able to — am I able to opt out? Am I able to say, “It's okay for you to track what I'm saying on Facebook, but I don't want you to track what I'm texting to somebody else, off Facebook, on an Android phone”?

ZUCKERBERG: Okay. Yes, senator. In — in general, Facebook is not collecting data from other apps that you use. There may be some specific things about the device that you're using that Facebook needs to understand in order to offer the service.

But, if you're using Google or you're using some texting app, unless you specifically opt in that you want to share the texting app information, Facebook wouldn't see that.

BLUNT: Has it always been that way? Or is that a recent addition to how you deal with those other ways that I might communicate?

ZUCKERBERG: Senator, my understanding is that that is how the mobile operating systems are architected.

BLUNT: The — so do you — you don't have bundled permissions for how I can agree to what devices I may use, that you may have contact with? Do you — do you bundle that permission? Or am I able to, one at a — individually say what I'm willing for you to — to watch, and what I don't want you to watch?

And I think we might have to take that for the record, based on everybody else's time.
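The cross-device linking Zuckerberg acknowledged at the start of this exchange, where accounts are associated with multiple devices so activity and state can sync between them, can be sketched as a simple device-to-account mapping. Hypothetical names; an illustration of the general idea only:

```python
class DeviceGraph:
    """Toy sketch of cross-device account linking: each device is mapped to
    an account, so events from any linked device land in one shared timeline.
    Hypothetical names; an illustration only."""

    def __init__(self):
        self.device_owner = {}  # device_id -> account
        self.events = {}        # account -> list of (device_id, event)

    def link(self, account, device_id):
        self.device_owner[device_id] = account

    def record(self, device_id, event):
        # Events from unlinked devices have no account to attach to.
        account = self.device_owner.get(device_id)
        if account is not None:
            self.events.setdefault(account, []).append((device_id, event))

    def timeline(self, account):
        return self.events.get(account, [])


graph = DeviceGraph()
graph.link("alice", "phone-1")
graph.link("alice", "laptop-1")
graph.record("phone-1", "opened app")
graph.record("laptop-1", "resumed session")
print(graph.timeline("alice"))  # events from both devices, one account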

THUNE: Thank you, Senator Blunt.

Next up, Senator Durbin.

SEN. RICHARD J. DURBIN (D-ILL): Thanks very much, Mr. Chairman.

Mr. Zuckerberg, would you be comfortable sharing with us the name of the hotel you stayed in last night?

ZUCKERBERG: No.

(LAUGHTER)

DURBIN: If you messaged anybody this week, would you share with us the names of the people you've messaged?

ZUCKERBERG: Senator, no. I would probably not choose to do that publicly, here.

DURBIN: I think that may be what this is all about: your right to privacy, the limits of your right to privacy and how much you give away in modern America in the name of, quote, “connecting people around the world;” a question, basically, of what information Facebook's collecting, who they're sending it to and whether they ever asked me, in advance, my permission to do that. Is that a fair thing for the user of Facebook to expect?

ZUCKERBERG: Yes, senator. I think everyone should have control over how their information is used. And as we've talked about in some of the other questions, I think of that is laid out in and some of the documents, but more importantly, you want your people control in the product itself.

So the most important way that this happens across our services is that every day, people come to our services to choose to share photos or send messages, and every single time they choose to share something, there — they have a control right there about who they want to share it with. But that level of control is extremely important.

DURBIN: They certainly know within the Facebook pages who their friends are, but they may not know as has happened — and you've conceded this point in the past, that sometimes that information is going way beyond there friends, and sometimes people have made money off of sharing that information, correct?

ZUCKERBERG: Senator, you are referring I think to our developer platform, and it may be useful for me to give some background on how we set that up, if that's useful.

DURBIN: I have three minutes left, so maybe you can do that for the record, because I have couple other questions I would like to ask. You have recently announced something that is called Messenger Kids. Facebook created an app allowing kids between the ages of 6 and 12 to send video and text messages through Facebook as an extension of their parent's account. You have cartoonlike stickers, and other features designed to appeal to little kids — first-graders, kindergartners.

On January 30th, the Campaign for Commercial-Free Childhood and lots of other child development organizations warned Facebook. They pointed to a wealth of research demonstrating the excessive use of digital devices and social media is harmful to kids, and argued that young children simply are not ready to handle social media accounts at age 6. In addition, their concerns about data that is being gathered about these kids.

Now, there are certain limits of the law, we know. There's a Children's Online Privacy Protection Act. What guarantees can you give us the note data from Messenger Kids is or will be collected or shared with those of might violate that law?

ZUCKERBERG: All right, senator, so a number of things I think are — are important here. The background on Messenger Kids is, we heard feedback from thousands of parents that they want to be able to stay in touch with their kids and call them, use apps like FaceTime when they're working late or not around and want to communicate with their kids, but they want to have complete control over that. So I think we can all agree that if you — when your kid is 6 or 7, even if they have access to a phone, you want to control everyone who they can contact. And there was an app out there that did that. So we build this service to do that.

The app collects a minimum amount of information that is necessary to operate the service. So, for example, the messages that people send is something that we collect in order to operate the service, but in general, that data is not going to be shared with third parties, it is not connected to the broader Facebook ...

DURBIN: Excuse me, as a lawyer, I picked up on that word “in general,” the phrase “in general.” It seems to suggest that in some circumstances it will be shared with third parties.

ZUCKERBERG: No. It will not.

DURBIN: All right. Would you be open to the idea that someone having reached adult age, having grown up with Messenger Kids, should be allowed to delete the data that you collected?

ZUCKERBERG: Senator, yes. As a matter of fact, when you become 13, which is our legal limit — our limit — we don't allow people under the age of 13 to use Facebook — you don't automatically go from having a Messenger Kids account to a Facebook account. You have to start over and get a Facebook account.

So I think it's a good idea to consider making sure that all that information is deleted, and in general, people are going to be starting over when they get their — their Facebook or other accounts.

DURBIN: I'll close, because I just have a few seconds. My state of Illinois has a Biometric Information Privacy Act, which regulates the commercial use of facial, voice, finger and iris scans and the like. We're now in a fulsome debate on that. And I'm afraid Facebook has come down to the position of trying to carve out exceptions to that. I hope you'll fill me in on how that is consistent with protecting privacy. Thank you.

THUNE: Thank you, Senator Durbin.

Senator Cornyn?

SEN. JOHN CORNYN (R-TEX): Thank you, Mr. Zuckerberg, for being here. I know in — up until 2014, a mantra or motto of Facebook was move fast and break things. Is that correct?

ZUCKERBERG: I don't know when we changed it, but the mantra is currently move fast with stable infrastructure, which is a much less sexy mantra.

CORNYN: Sounds much more boring. But my question is, during the time that it was Facebook's mantra or motto to move fast and break things, do you think some of the misjudgments, perhaps mistakes that you've admitted to here, were as a result of that culture or that attitude, particularly as it regards to personal privacy of the information of your subscribers?

ZUCKERBERG: Senator, I do think that we made mistakes because of that. But the broadest mistakes that we made here are not taking a broad enough view of our responsibility. And while that wasn't a matter — the “move fast” cultural value is more tactical around whether engineers can ship things and — and different ways that we operate.

But I think the big mistake that we've made looking back on this is viewing our responsibility as just building tools, rather than viewing our whole responsibility as making sure that those tools are used for good.

CORNYN: Well I — and I appreciate that. Because previously, or earlier in the past, we've been told that platforms like Facebook, Twitter, Instagram, the like are neutral platforms, and the people who own and run those for profit — and I'm not criticizing doing something for profit in this country.

But they bore no responsibility for the content. Do you agree now that Facebook and the other social media platforms are not neutral platforms, but bear some responsibility for the content?

ZUCKERBERG: I agree that we're responsible for the content, but I think that there's — one of the big societal questions that I think we're going to need to answer is the current framework that we have is based on this reactive model, that assumed that there weren't A.I. tools that could proactively tell, you know, whether something was terrorist content or something bad, so it naturally relied on requiring people to flag for a company, and then the company needing to take reasonable action.

In the future, we're going to have tools that are going to be able to identify more types of bad content. And I think that there is — there are moral and legal obligation questions that I think we'll have to wrestle with as a society about when we want to require companies to take action proactively on certain of those things, and when that gets in the way of ...

CORNYN: I appreciate that, I have two minutes left ...

ZUCKERBERG: All right.

CORNYN: ... to ask you questions.

So you — you — interestingly, the terms of the — what do you call it, the terms of service is a legal document which discloses to your subscribers how their information is going to be used, how Facebook is going to operate.

CORNYN: And — but you concede that — you doubt everybody reads or understands that legalese, those terms of service. So are — is that to suggest that the consent that people give subject to that terms of service is not informed consent? In other words, they may not read it, and even if they read it, they may not understand it?

ZUCKERBERG: I just think we have a broader responsibility than what the law requires. So I — what you ...

CORNYN: No, I'm talking — I'm talking about — I appreciate that. What I'm asking about, in terms of what your subscribers understand, in terms of how their data is going to be used — but let me go to the terms of service.

Under paragraph number two, you say, “You own all of the content and information you post on Facebook.” That's what you've told us here today, a number of times.

So, if I chose to terminate my Facebook account, can I bar Facebook or any third parties from using the data that I had previously supplied, for any purpose whatsoever?

ZUCKERBERG: Yes, senator. If you delete your account, we should get rid of all of your information.

CORNYN: You should? Or do you?

ZUCKERBERG: We do. We do.

CORNYN: How about third parties that you have contracted with to use some of that underlying information, perhaps to target advertising for themselves? You can't — do you — do you call back that information, as well? Or does that remain in their custody?

ZUCKERBERG: Well, senator, this is actually a very important question, and I'm glad you brought this up, because there's a very common misperception about Facebook — that we sell data to advertisers. And we do not sell data to advertisers. We don't sell data to anyone.

CORNYN: Well, you clearly rent it.

ZUCKERBERG: What we allow is for advertisers to tell us who they want to reach, and then we do the placement. So, if an advertiser comes to us and says, “All right, I am a ski shop and I want to sell skis to women,” then we might have some sense, because people shared skiing-related content, or said they were interested in that, they shared whether they're a woman, and then we can show the ads to the right people without that data ever changing hands and going to the advertiser.

That's a very fundamental part of how our model works and something that is often misunderstood. So I'm — I appreciate that you brought that up.
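The placement model described in this exchange — the advertiser supplies only targeting criteria, and the platform does the matching against attribute data that it alone holds — can be sketched in a few lines. This is a hypothetical illustration for intuition only, not Facebook's actual system; the user records, field names, and `place_ad` function are all invented.

```python
# Illustrative sketch of platform-side ad targeting: the advertiser
# submits only criteria; the platform matches users itself, so user
# attributes never change hands. All data here is hypothetical.

# User attribute store, visible only to the platform.
users = {
    "alice": {"gender": "female", "interests": {"skiing", "hiking"}},
    "bob":   {"gender": "male",   "interests": {"chess"}},
    "carol": {"gender": "female", "interests": {"skiing"}},
}

def place_ad(ad_id, criteria):
    """Return the user IDs to show the ad to. Only the ad_id and
    aggregate counts would ever be reported back to the advertiser,
    never the attribute store itself."""
    matched = []
    for user_id, attrs in users.items():
        if criteria.get("gender") and attrs["gender"] != criteria["gender"]:
            continue
        if criteria.get("interest") and criteria["interest"] not in attrs["interests"]:
            continue
        matched.append(user_id)
    return matched

# The ski shop from the testimony: "sell skis to women."
audience = place_ad("ski_shop_promo", {"gender": "female", "interest": "skiing"})
print(sorted(audience))  # ['alice', 'carol']
```

The property being claimed in the testimony lives in the return path: the advertiser receives placements, not the user records the placements were computed from.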

THUNE: Thank you, Senator Cornyn.

We had indicated earlier on that we would take a couple of breaks, give our witness an opportunity. And I think we've been going, now, for just under two hours. So I think what we'll do is ...

(CROSSTALK)

ZUCKERBERG: You can do a few more.

(LAUGHTER)

THUNE: You — you're — you want to keep going?

ZUCKERBERG: Maybe — maybe 15 minutes. Does that work?

THUNE: Okay. All right, we'll keep going.

Senator Blumenthal is up next. And we will commence.

SEN. RICHARD BLUMENTHAL (D-CONN): Thank you, Mr. Chairman. Thank you for being here today, Mr. Zuckerberg.

You have told us today — and you've told the world — that Facebook was deceived by Aleksandr Kogan when he sold user information to Cambridge Analytica, correct?

ZUCKERBERG: Yes.

BLUMENTHAL: I want to show you the terms of service that Aleksandr Kogan provided to Facebook and note for you that, in fact, Facebook was on notice that he could sell that user information.

Have you seen these terms of service before?

ZUCKERBERG: I have not.

BLUMENTHAL: Who in Facebook was responsible for seeing those terms of service that put you on notice that that information could be sold?

ZUCKERBERG: Senator, our app review team would be responsible for that. Had ...

BLUMENTHAL: Has anyone been fired on that app review team?

ZUCKERBERG: Senator, not because of this.

BLUMENTHAL: Doesn't that term of service conflict with the FTC order that Facebook was under at that very time that this term of service was, in fact, provided to Facebook? And you'll note that the Face — the FTC order specifically requires Facebook to protect privacy. Isn't there a conflict there?

ZUCKERBERG: Senator, it certainly appears that we should have been aware that this app developer submitted a term that was in conflict with the rules of the platform.

BLUMENTHAL: Well, what happened here was, in effect, willful blindness. It was heedless and reckless, which, in fact, amounted to a violation of the FTC consent decree. Would you agree?

ZUCKERBERG: No, senator. My understanding is that — is not that this was a violation of the consent decree.

But as I've said a number of times today, I think we need to take a broader view of our responsibility around privacy than just what is mandated in the current law.

BLUMENTHAL: Well, here is my reservation, Mr. Zuckerberg. And I apologize for interrupting you, but my time is limited.

We've seen the apology tours before. You have refused to acknowledge even an ethical obligation to have reported this violation of the FTC consent decree. And we have letters — we've had contacts with Facebook employees. And I am going to submit a letter for the record from Sandy Parakilas, with your permission, that indicates not only a lack of resources, but lack of attention to privacy.

And so, my reservation about your testimony today is that I don't see how you can change your business model unless there are specific rules of the road.

Your business model is to monetize user information to maximize profit over privacy. And unless there are specific rules and requirements enforced by an outside agency, I have no assurance that these kinds of vague commitments are going to produce action.

So I want to ask you a couple of very specific questions. And they are based on legislation that I've offered, the MY DATA Act; legislation that Senator Markey is introducing today, the CONSENT Act, which I'm joining.

Don't you agree that companies ought to be required to provide users with clear, plain information about how their data will be used, and specific ability to consent to the use of that information?

ZUCKERBERG: Senator, I do generally agree with what you're saying. And I laid that out earlier when I talked about what ...

BLUMENTHAL: Would you agree to an opt-in as opposed to an opt-out?

ZUCKERBERG: Senator, I think that — that certainly makes sense to discuss. And I think the details around this matter a lot.

BLUMENTHAL: Would you agree that users should be able to access all of their information?

ZUCKERBERG: Senator, yes. Of course.

BLUMENTHAL: All of the information that you collect as a result of purchases from data brokers, as well as tracking them?

ZUCKERBERG: Senator, we have already a “download your information” tool that allows people to see and to take out all of the information that Facebook — that they've put into Facebook or that Facebook knows about them. So, yes, I agree with that. We already have that.

BLUMENTHAL: I have a number of other specific requests that you agree to support as part of legislation. I think legislation is necessary. The rules of the road have to be the result of congressional action.

We have — Facebook has participated recently in the fight against scourge — the scourge of sex trafficking. And a bill that we've just passed — it will be signed into law tomorrow — SESTA, the Stop Enabling Sex Traffickers Act — was the result of our cooperation. I hope that we can cooperate on this kind of measure as well.

ZUCKERBERG: Senator, I look forward to having my team work with you on this.

THUNE: Thank you, Senator Blumenthal.

Senator Cruz.

SEN. TED CRUZ (R-TEX): Thank you Mr. Chairman. Mr. Zuckerberg, welcome. Thank you for being here.

Mr. Zuckerberg, does Facebook consider itself a neutral public forum?

ZUCKERBERG: Senator, we consider ourselves to be a platform for all ideas.

CRUZ: Let me ask the question again. Does Facebook consider itself to be a neutral public forum? Representatives of your company have given conflicting answers on this. Are you a ...

ZUCKERBERG: Well ...

CRUZ: ... First Amendment speaker expressing your views, or are you a neutral public forum allowing everyone to speak?

ZUCKERBERG: Senator, here's how we think about this: I don't believe that — there are certain content that clearly we do not allow, right? Hate speech, terrorist content, nudity, anything that makes people feel unsafe in the community. From that perspective, that's why we generally try to refer to what we do as platform for all ideas ...

CRUZ: Let me try this, because the time is constrained. It's just a simple question. The predicate for Section 230 immunity under the CDA is that you're a neutral public forum. Do you consider yourself a neutral public forum, or are you engaged in political speech, which is your right under the First Amendment?

ZUCKERBERG: Well, senator, our goal is certainly not to engage in political speech. I am not that familiar with the specific legal language of the — the law that you — that you speak to. So I would need to follow up with you on that. I'm just trying to lay out how broadly I think about this.

CRUZ: Mr. Zuckerberg, I will say there are a great many Americans who I think are deeply concerned that Facebook and other tech companies are engaged in a pervasive pattern of bias and political censorship. There have been numerous instances with Facebook. In May of 2016, Gizmodo reported that Facebook had purposely and routinely suppressed conservative stories from trending news, including stories about CPAC, including stories about Mitt Romney, including stories about the Lois Lerner IRS scandal, including stories about Glenn Beck.

In addition to that, Facebook has initially shut down the Chick-fil-A Appreciation Day page, has blocked a post of a Fox News reporter, has blocked over two dozen Catholic pages, and most recently blocked Trump supporters Diamond and Silk's page, with 1.2 million Facebook followers, after determining their content and brand were, quote, “unsafe to the community.”

To a great many Americans that appears to be a pervasive pattern of political bias. Do you agree with that assessment?

ZUCKERBERG: Senator, let me say a few things about this. First, I understand where that concern is coming from, because Facebook and the tech industry are located in Silicon Valley, which is an extremely left-leaning place, and I — this is actually a concern that I have and that I try to root out in the company, is making sure that we do not have any bias in the work that we do, and I think it is a fair concern that people would at least wonder about. Now ...

CRUZ: Let me — let me ask this question: Are you aware of any ad or page that has been taken down from Planned Parenthood?

ZUCKERBERG: Senator, I'm not. But let me just ...

CRUZ: How about moveon.org?

ZUCKERBERG: Sorry.

CRUZ: How about moveon.org?

ZUCKERBERG: I'm not specifically aware of those ...

CRUZ: How about any Democratic candidate for office?

ZUCKERBERG: I'm not specifically aware. I mean, I'm not sure.

CRUZ: In your testimony, you say that you have 15,000 to 20,000 people working on security and content review. Do you know the political orientation of those 15,000 to 20,000 people engaged in content review?

ZUCKERBERG: No, senator. We do not generally ask people about their political orientation when they're joining the company.

CRUZ: So as CEO, have you ever made hiring or firing decisions based on political positions or what candidates they supported?

ZUCKERBERG: No.

CRUZ: Why was Palmer Luckey fired?

ZUCKERBERG: That is a specific personnel matter that seems like it would be inappropriate to speak to here.

CRUZ: You just made a specific representation, that you didn't make decisions based on political views. Is that accurate?

ZUCKERBERG: Well, I can — I can commit that it was not because of a political view.

CRUZ: Do you know, of those 15 to 20,000 people engaged in content review, how many, if any, have ever supported, financially, a Republican candidate for office?

ZUCKERBERG: Senator, I do not know that.

CRUZ: Your testimony says, “It is not enough that we just connect people. We have to make sure those connections are positive.” It says, “We have to make sure people aren't using their voice to hurt people or spread misinformation. We have a responsibility, not just to build tools, to make sure those tools are used for good.”

Mr. Zuckerberg, do you feel it's your responsibility to assess users, whether they are good and positive connections or ones that those 15 to 20,000 people deem unacceptable or deplorable?

ZUCKERBERG: Senator, you're asking about me personally?

CRUZ: Facebook.

ZUCKERBERG: Senator, I think that there are a number of things that we would all agree are clearly bad. Foreign interference in our elections, terrorism, self-harm. Those are things ...

CRUZ: I'm talking about censorship.

ZUCKERBERG: Well, I — I think that you would probably agree that we should remove terrorist propaganda from the service. So that, I agree. I think it is — is clearly bad activity that we want to get down. And we're generally proud of — of how well we — we do with that.

Now what I can say — and I — and I do want to get this in before the end, here — is that I am — I am very committed to making sure that Facebook is a platform for all ideas. That is a — a very important founding principle of — of what we do.

We're proud of the discourse and the different ideas that people can share on the service, and that is something that, as long as I'm running the company, I'm going to be committed to making sure is the case.

CRUZ: Thank you.

THUNE: Thank you, Senator Cruz.

Do you want to break now?

(LAUGHTER)

Or do you want to keep going?

ZUCKERBERG: Sure. I mean, that was — that was pretty good. So. All right.

THUNE: All right. We have — Senator Whitehouse is up next. But if you want to take a ...

ZUCKERBERG: Yeah.

THUNE: ... a five-minute break right now, we have now been going a good two hours, so ...

ZUCKERBERG: Thank you.

THUNE: ... I will be — we'll recess for five minutes and reconvene.

(RECESS)

GRASSLEY: We'll come to order.

(CROSSTALK)

GRASSLEY: Oh, okay. I want to read this first.

Before I call on Senator Whitehouse, Senator Feinstein asked permission to put letters and statements in the record, and without objection they will be put in from the ACLU, the Electronic Privacy Information Center, the Association for Computing — Computing Machinery Public Policy Council and Public Knowledge.

Senator Whitehouse?

SEN. SHELDON WHITEHOUSE (D-RI): Thank you, Chairman.

ZUCKERBERG: Thank you. Mr. Chairman, I want to correct one thing that I said earlier in response to a question from Senator Leahy. He had asked if — why we didn't ban Cambridge Analytica at the time when we learned about them in 2015. And I answered that what my — what my understanding was, was that they were not on the platform, were not an app developer or advertiser. When I went back and met with my team afterwards, they let me know that Cambridge Analytica actually did start as an advertiser later in 2015. So we could have in theory banned them then. We made a mistake by not doing so. But I just wanted to make sure that I updated that because I — I — I misspoke, or got that wrong earlier.

GRASSLEY: (OFF-MIKE) Whitehouse?

WHITEHOUSE: Thank you, Chairman.

Welcome back, Mr. Zuckerberg.

On the subject of bans, I just wanted to explore a little bit what these bans mean. Obviously Facebook has suffered considerable reputational damage from its association with Aleksandr Kogan and with Cambridge Analytica, which is one of the reasons you're having this enjoyable afternoon with us. Your testimony says that Aleksandr Kogan's app has been banned. Has he also been banned?

ZUCKERBERG: Yes, my understanding is he has.

WHITEHOUSE: So if he were to open up another account under a different name and you were able to find out, that would be taken — that would be closed down?

ZUCKERBERG: Senator, I believe we — we are preventing him from building any more apps.

WHITEHOUSE: Does he have a Facebook account still?

ZUCKERBERG: Senator, I believe the answer to that is no, but I can follow up with you afterwards.

WHITEHOUSE: Okay. And with respect to Cambridge Analytica, your testimony is that first you required them to formally certify that they had deleted all improperly acquired data. Where did that formal certification take place? That sounds kind of like a quasi-official thing, to formally certify. What did that entail?

ZUCKERBERG: Senator, first they sent us an email notice from their chief data officer telling us that they didn't have any of the data any more, that they deleted it and weren't using it. And then later we followed up with, I believe, a full legal contract where they certified that they had deleted the data.

WHITEHOUSE: In a legal contract?

ZUCKERBERG: Yes, I believe so.

WHITEHOUSE: Okay. And then you ultimately said that you have banned Cambridge Analytica. Who exactly is banned? What if they opened up Princeton, Rhode Island Analytica? Different corporate form, same enterprise. Would that enterprise also be banned?

ZUCKERBERG: Senator, that is certainly the intent. Cambridge Analytica actually has a parent company and we banned the parent company. And recently we also banned a firm called AIQ, which I think is also associated with them. And if we find other firms that are associated with them, we will block them from the platform as well.

WHITEHOUSE: Are individual principals — P-A-L-S, principals of the firm also banned?

ZUCKERBERG: Senator, my understanding is we're blocking them from doing business on the platform, but I do not believe that we're blocking people's personal accounts.

WHITEHOUSE: Okay. Can any customer amend your terms of service? Or are the terms of service a take-it-or-leave-it proposition for the average customer?

ZUCKERBERG: Senator, I think the terms of service are what they are. But the service is really defined by people. Because you get to choose what information you share, and the whole service is about what friends you connect to, which people you choose to connect to ...

WHITEHOUSE: Yes, I guess my question would relate to — Senator Graham held up that big, fat document. It's easy to put a lot of things buried in a document that then later turn out to be of consequence. And all I wanted to establish with you is that that document that Senator Graham held up, that is not a negotiable thing with individual customers; that is a take it or leave it proposition for your customers to sign up to, or not use the service.

ZUCKERBERG: Senator, that's right on the terms of the service, although we offer a lot of controls so people can configure the experience how they want.

WHITEHOUSE: So, last question, on a different subject having to do with the authorization process that you are undertaking for entities that are putting up political content or so-called issue-ad content. You said that they all have to go through an authorization process before they do it. You said here we will be verifying the identity. How do you look behind a shell corporation and find who's really behind it through your authorization process?

Well, step back. Do you need to look behind shell corporations in order to find out who is really behind the content that's being posted? And if you may need to look behind a shell corporation, how will you go about doing that? How will you get back to the true, what lawyers would call, beneficial owner of the site that is putting out the political material?

ZUCKERBERG: Senator, are — are you referring to the verification of political and issue ads?

WHITEHOUSE: Yes, and before that, political ads, yes.

ZUCKERBERG: Yes. So what we're going to do is require a valid government identity and we're going to verify the location. So we're going to do that so that way someone sitting in Russia, for example, couldn't say that they're in America and, therefore, able to run an election ad.

WHITEHOUSE: But if they were running through a corporation domiciled in Delaware, you wouldn't know that they were actually a Russian owner.

ZUCKERBERG: Senator, that's — that's correct.

WHITEHOUSE: Okay. Thank you, my time has expired and I appreciate the courtesy of the chair for the extra seconds. Thank you, Mr. Zuckerberg.

GRASSLEY: Senator Lee.

SEN. MIKE LEE (R-UTAH): Thank you, Mr. Chairman. Mr. Zuckerberg, I wanted to follow up on a statement you made shortly before the break just a few minutes ago. You said that there are some categories of speech, some types of content that Facebook would never want to have any part of and takes active steps to avoid disseminating, including hate speech, nudity, racist speech, I — I — I assume you also meant terrorist acts, threats of physical violence, things like that.

Beyond that, would you agree that Facebook ought not be putting its thumb on the scale with regard to the content of speech, assuming it falls outside one of those categories that — that's prohibited?

ZUCKERBERG: Senator, yes. There are generally two categories of content that — that we're very worried about. One are things that could cause real world harm, so terrorism certainly fits into that, self-harm fits into that, I would consider election interference to fit into that and those are the types of things that we — I — I don't really consider there to be much discussion around whether those are good or bad topics.

LEE: Sure, yes, and I'm not disputing that. What I'm asking is, once you get beyond those categories of things that are prohibited, and should be, is it Facebook's position that it should not be putting its thumb on the scale; it should not be favoring or disfavoring speech based on its content, based on the viewpoint of that speech?

ZUCKERBERG: Senator, in general that's our position. What we — one of the things that is really important though is that in order to create a service where everyone has a voice, we also need to make sure that people aren't bullied, or — or basically intimidated, or the environment feels unsafe for them.

LEE: Okay. So when you say in general, that's the — the exception that you're referring to, the exception being that if someone feels bullied, even if it's not a terrorist act, nudity, terrorist threats, racist speech, or something like that, you might step in there. Beyond that, would you step in and put your thumb on the scale as far as the viewpoint of the content being posted?

ZUCKERBERG: Senator, no. I mean, in general our — our goal is to allow people to have as much expression as possible.

LEE: Okay. So subject to the exceptions we've discussed, you would stay out of that.

Let me ask you this, isn't there a significant free market incentive that a social media company, including yours, has, in order to safeguard the data of your users? Don't you have free market incentives in that respect?

ZUCKERBERG: Yes, senator. Yes.

LEE: Does — don't your interests align with — with those of us here who want to see data safeguarded?

ZUCKERBERG: Absolutely.

LEE: Do you have the technological means available, at your disposal, to make sure that that doesn't happen and to — to protect, say, an app developer from transferring Facebook data to a third party?

ZUCKERBERG: Senator, a lot of that, we do. And some of that happens outside of our systems and will require new measures. And so, for example, what we saw here was people chose to share information with an app developer. That worked according to how the system was designed.

That information was then transferred out of our system to servers that this developer, Aleksandr Kogan, had. And then that person chose to then go sell the data to Cambridge Analytica.

That is going to require much more active intervention and auditing from us to prevent, going forward, because once it's out of our system it is a lot harder for us to have a full understanding of what's happening.

LEE: From what you've said today, and from previous statements made by you and other officials at your company, data is at the center of your business model. It's how you make money. Your ability to run your business effectively, given that you don't charge your users, is based on monetizing data.

And so the real issue, it seems to me, really comes down to what you tell the public, what you tell users of Facebook, about what you're going to do with the data. About how you're going to use it.

Can you — can you give me a couple of examples, maybe two examples, of ways in which data is collected by Facebook, in a way that people are not aware of? Two examples of types of data that Facebook collects that might be surprising to Facebook users?

ZUCKERBERG: Well, senator, I would hope that what we do with data is not surprising to people.

LEE: And has it been at times?

ZUCKERBERG: Well, senator, I think in this case, people certainly didn't expect this developer to sell the data to Cambridge Analytica. In general, there are two types of data that Facebook has.

The vast majority — and then the first category, is content that people chose to share on the service themselves. So that's all the photos that you share, the posts that you make, what you think of as the Facebook service, right? That's — everyone has control every single time that they go to share that. They can delete that data any time they want; full control, the majority of the data.

The second category is around specific data that we collect in order to make the advertising experiences better, and more relevant, and work for businesses. And those often revolve around measuring, okay, if you — if we showed you an ad, then you click through and you go somewhere else, we can measure that you actually — that the — that the ad worked. That helps make the experience more relevant and better for — for people, who are getting more relevant ads, and better for the businesses because they perform better.

You also have control completely of that second type of data. You can turn off the ability for Facebook to collect that — your ads will get worse, so a lot of people don't want to do that. But you have complete control over what you do there as well.
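The measurement loop described in this answer — log that an ad was shown, log the click-through, report only aggregate performance back to the business — can be sketched as follows. This is a minimal hypothetical illustration, not Facebook's real pipeline; the function names, event counts, and report format are invented.

```python
# Illustrative sketch of the ad-measurement loop described above:
# record impressions and click-throughs, then report only aggregate
# performance (click-through rate) to the business. Hypothetical data.
from collections import defaultdict

impressions = defaultdict(int)  # ad_id -> times the ad was shown
clicks = defaultdict(int)       # ad_id -> times a user clicked through

def log_impression(ad_id):
    impressions[ad_id] += 1

def log_click(ad_id):
    clicks[ad_id] += 1

def report(ad_id):
    """Aggregate report for the advertiser: counts and CTR, no per-user data."""
    shown = impressions[ad_id]
    ctr = clicks[ad_id] / shown if shown else 0.0
    return {"impressions": shown, "clicks": clicks[ad_id], "ctr": ctr}

for _ in range(200):
    log_impression("ski_shop_promo")
for _ in range(10):
    log_click("ski_shop_promo")

print(report("ski_shop_promo"))  # {'impressions': 200, 'clicks': 10, 'ctr': 0.05}
```

As in the targeting sketch, the business-facing surface is aggregate: the advertiser learns that "the ad worked," not who it worked on.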

GRASSLEY: Senator Schatz?

SEN. BRIAN SCHATZ (D-HAWAII): Thank you, Mr. Chairman. I want to follow up on the questions around the terms of service. Your terms of service are about 3,200 words with 30 links. One of the links is to your data policy, which is about 2,700 words with 22 links. And I think the point has been well made that people really have no earthly idea of what they're signing up for.

And I understand that, at the present time, that's legally binding. But I'm wondering if you can explain to the billions of users, in plain language, what are they signing up for?

ZUCKERBERG: Senator, that's a good and important question here. In general, you know, you sign up for the Facebook, you get the ability to share the information that you want with — with people. That's what the service is, right? It's that you can connect with the people that you want, and you can share whatever content matters to you, whether that's photos or links or posts, and you get control over it.

SCHATZ: Who do you share it with?

ZUCKERBERG: And you can take it down if you want, and you don't need to put anything up in the first place if you don't want.

SCHATZ: What about the part that people are worried about, not the fun part?

ZUCKERBERG: Well, what's that?

SCHATZ: The — the part that people are worried about is that the data is going to be improperly used. So people are trying to figure out are your D.M.s informing the ads? Are your browsing habits being collected?

Everybody kind of understands that when you click like on something or if you say you like a certain movie or have a — a particular political proclivity, that — I think that's fair game; everybody understands that.

What we don't understand exactly, both as a matter of practice and as a matter of not being able to decipher those terms of service and the privacy policy, is what exactly you are doing with the data, and whether you draw a distinction between data collected in the process of utilizing the platform and data that we clearly volunteer to the public to present ourselves to other Facebook users.

ZUCKERBERG: Senator, I'm not sure I — I fully understand this. In — in general, you — your — you — people come to Facebook to share content with other people. We use that in order to also inform how we rank services like news feed and ads to provide more relevant experiences.

SCHATZ: Let me — let me try a couple of specific examples. If I'm email — if I'm mailing — emailing within WhatsApp, does that ever inform your advertisers?

ZUCKERBERG: No, we don't see any of the content in WhatsApp, it's fully encrypted.

SCHATZ: Right, but — but is there some algorithm that spits out some information to your ad platform and then let's say I'm emailing about Black Panther within WhatsApp, do I get a WhatsApp — do I get a Black Panther banner ad?

ZUCKERBERG: Senator, we don't — Facebook systems do not see the content of messages being transferred over WhatsApp.

SCHATZ: Yes, I know, but that's — that's not what I'm asking. I'm asking about whether these systems talk to each other without a human being touching it.

ZUCKERBERG: Senator, I think the answer to your specific question is, if you message someone about Black Panther in WhatsApp, it would not inform any ads.

SCHATZ: Okay, I want to follow up on Senator Nelson's original question which is the question of ownership of the data. And I understand as the sort of matter of principle, you were saying, you know, we want our customers to have more rather than less control over the data.

But I can't imagine that it's true as a legal matter that I actually own my Facebook data, because you're the one monetizing it. Do you want to modify that to express it as a statement of principle, a sort of aspirational goal? Because it doesn't seem to me that we own our own data; otherwise we'd be getting a cut.

ZUCKERBERG: Well, senator, you own it in the sense that you chose to put it there, you could take it down anytime, and you completely control the terms under which it's used.

When you put it on Facebook, you are granting us a license to be able to show it to other people. I mean, that's necessary in order for the service to operate.

SCHATZ: Right, but the — so the — the — so your definition of ownership is I sign up, I've voluntarily — and I may delete my account if I wish, but that's basically it.

ZUCKERBERG: Well, senator, I — I think that the control is much more granular than that. You can choose each photo that you want to put up or each message, and you can delete those.

And you don't need to delete your whole account, you have specific control. You can share different posts with different people.

SCHATZ: In the time I have left, I want to — I want to propose something to you and take it for the record. I read an interesting article this week by Professor Jack Balkin at Yale that proposes a concept of an information fiduciary.

People think of fiduciaries as responsible primarily in the economic sense, but this is really about a trust relationship like doctors and lawyers, tech companies should hold in trust our personal data.

Are you open to the idea of an information fiduciary enshrined in statute?

ZUCKERBERG: Senator, I think it's certainly an interesting idea, and Jack is very thoughtful in this space, so I do think it deserves consideration.

SCHATZ: Thank you.

THUNE: Senator Fischer?

FISCHER: Thank you, Mr. Chairman.

FISCHER: Thank you, Mr. Zuckerberg, for being here today. I appreciate your testimony.

The full scope of a Facebook user's activity can paint a very personal picture, I think. And additionally, you have those 2 billion users that are out there every month. And so we all know that's larger than the population of most countries. So how many data categories do you store, does Facebook store, on the categories that you collect?

ZUCKERBERG: Senator, can you clarify what you mean by data categories?

FISCHER: Well, there's — there's some past reports that have been out there that indicate that it — that Facebook collects about 96 data categories for those 2 billion active users. That's 192 billion data points that are being generated, I think, at any time from consumers globally. So how many do — does Facebook store out of that? Do you store any?

ZUCKERBERG: Senator, I'm not actually sure what that is referring to.

FISCHER: On — on the points that you collect information, if we call those categories, how many do you store of information that you are collecting?

ZUCKERBERG: Senator, the way I think about this is that there are two broad categories. This probably doesn't line up with whatever the specific report you were seeing is, and I can make sure that we follow up with you afterwards to get you the information you need on that. The first broad category is content that a person has chosen to share and that they have complete control over: they get to control when they put it into the service, when they take it down, and who sees it. And then the other category is data that are connected to making the ads relevant. You have complete control over both. You can turn off the data related to ads, and you can choose not to share any content, or control exactly who sees it, or take down the content in the former category.

FISCHER: And does Facebook store any of that?

ZUCKERBERG: Yes.

FISCHER: How much do you store of that? All of it? All of it? Everything we click on, is that in storage somewhere?

ZUCKERBERG: Senator, we store data about what people share on the service and information that's required to do ranking better, to show you what you care about in news feed.

FISCHER: Do you — do you store text history, user content, activity, device location?

ZUCKERBERG: Senator, some of that content with people's permission, we do store.

FISCHER: Do you disclose any of that?

ZUCKERBERG: Yes, it — Senator, in order to — for people to share that information with Facebook, I believe that almost everything that you just said would be opt in.

FISCHER: And the privacy settings, it's my understanding that they limit the sharing of that data with other Facebook users, is that correct?

ZUCKERBERG: Senator, yes. Every person gets to control who gets to see their content.

FISCHER: And does that also limit the ability for Facebook to collect and use it?

ZUCKERBERG: Senator, yes. There are other — there are controls that determine what Facebook can do as well. So for example, people have a control about face recognition. If people don't want us to be able to help identify when they are in photos that their friends upload, then they can turn that off.

FISCHER: Right.

ZUCKERBERG: And then we won't store that kind of template for them.

FISCHER: And — and there was some action taken by the FTC in 2011. And you wrote a Facebook post at the time saying that a public page on the Internet used to seem scary to people, but as long as they could make the page private, they felt safe sharing with their friends online; control was key. And you just mentioned control. Senator Hatch asked you a question and you responded there about complete control.

So you and your company have used that term repeatedly, and I believe you use it to reassure users, is that correct? That you do have control and complete control over this information?

ZUCKERBERG: Well, senator, this is how the service works. I mean, the core thing that Facebook is, and all of our services, WhatsApp, Instagram, Messenger.

FISCHER: So is this — is then a question of Facebook is about feeling safe, or are users actually safe? Is Facebook — is Facebook being safe?

ZUCKERBERG: Senator, I think Facebook is safe. I use it, my family uses it, and all the people I love and care about use it all the time. These controls are not just to make people feel safe; it's actually what people want in the product. The reality is that when you — just think about how you use this yourself. You don't want to share it — if you take a photo, you're not always going to send that to the same people. Sometimes you're going to want to text it to one person. Sometimes you might send it to a group. I bet you have a page. You'll probably want to put some stuff out there publicly so you can communicate with your constituents.

There are all these different groups of people that someone might want to connect with, and those controls are very important in practice for the operation of the service. Not just to build trust, although I think that providing people with control also does that, but actually in order to make it so that people can fulfill their goals of the service.

GRASSLEY: Senator Coons.

FISCHER: Thank you.

SEN. CHRISTOPHER A. COONS (D-DEL): Thank you, Chairman Grassley. Thank you, Mr. Zuckerberg, for joining us today.

I think the whole reason we're having this hearing is because of a tension between two basic principles you have laid out. First, you've said about the data that users post on Facebook: you control and own the data that you put on Facebook. You've said some very positive, optimistic things about privacy and data ownership. But it's also the reality that Facebook is a for-profit entity that generated $40 billion in ad revenue last year by targeting ads.

In fact, Facebook claims that advertising makes it easy to find the right people, capture their attention and get results and you recognize that an ad-supported service is, as you said earlier today, best aligned with your mission and values.

But the reality is, there are a lot of examples where ad targeting has led to results that I think we would all disagree with or dislike, or that would concern us. You've already admitted that Facebook's own ad tools allowed Russians to target users, voters, based on racist or anti-Muslim or anti-immigrant views, and that that may have played a significant role in the election here in the United States.

Just today, Time magazine posted a story saying that wildlife traffickers are continuing to use Facebook tools to advertise illegal sales of protected animal parts, and I am left questioning whether your ad-targeting tools would allow other concerning practices, like diet pill manufacturers targeting teenagers who are struggling with their weight, or allowing a liquor distributor to target alcoholics, or a gambling organization to target those with gambling problems.

I'll give you one concrete example I'm sure you are familiar with: ProPublica back in 2016 highlighted that Facebook lets advertisers exclude users by race in real estate advertising. There was a way that you could say that this particular ad, I only want to be seen by white folks, not by people of color, and that clearly violates fair-housing laws and our basic sense of fairness in the United States. And you promptly announced that that was a bad idea, you were going to change the tools, and that you would build a new system to spot and reject discriminatory ads that violate our commitment to fair housing.

COONS: And yet a year later, a follow-up story by ProPublica said that those changes hadn't fully been made; it was still possible to target housing advertisement in a way that was racially discriminatory.

And my concern is that this practice of making bold and — and engaging promises about changes and practices, and then the reality of how Facebook has operated in the real world, are in persistent tension.

Several different senators have asked earlier today about the 2011 FTC consent decree that required Facebook to better protect users' privacy.

And there are a whole series of examples where there have been things brought to your attention, where Facebook has apologized and has said we're going to change our practices and our policies. And yet, there doesn't seem to have been as much follow up as would be called for.

At the end of the day, policies aren't worth the paper they're written on if Facebook doesn't enforce them.

And I'll close with a question that's really rooted in an experience I had today, as an avid Facebook user. I woke up this morning and was notified by a whole group of friends across the country asking if I had a new family, or if there was a fake Facebook page for Chris Coons.

I went to the one they suggested. It had a different middle initial than mine. And there's my picture with Senator Dan Sullivan's family; same schools I went to, but a whole lot of Russian friends. Dan Sullivan's got a very attractive family, by the way.

SULLIVAN: Keep that for the record there, Mr. Chairman.

(LAUGHTER)

COONS: The friends who brought this to my attention included people I went to law school with in Hawaii and our own attorney general in the state of Delaware.

And fortunately I've got, you know, great folks who work in my office. I brought it to their attention. They pushed Facebook and it was taken down by midday.

But I'm left worried about what happens to Delawareans who don't have these resources. It's still possible to find Russian trolls operating on the platform. Hate groups thrive in some areas of Facebook, even though your policies prohibit hate speech, and you've taken strong steps against extremism and terrorists.

But is a Delawarean who's not in the Senate going to get the same sort of quick response? I've already gotten input from other friends who say they've had trouble getting a positive response when they've brought to Facebook's attention a page that's, frankly, clearly violating your basic principles.

My core question is isn't it Facebook's job to better protect its users? And why do you shift the burden to users to flag inappropriate content and make sure it's taken down?

ZUCKERBERG: Senator, there are a number of important points in there. And I think it's clear that this is an area, content policy enforcement, that we need to do a lot better on over time.

The history of how we got here is we started off in my dorm room with not a lot of resources and not having the A.I. technology to be able to proactively identify a lot of this stuff.

So just because of the sheer volume of content, the main way that this works today is that people report things to us and then we have our team review that.

And as I said before, by the end of this year, we're going to have more than 20,000 people at the company working on security and content review, because this is important.

Over time, we're going to shift increasingly to a method where more of this content is flagged up front by A.I. tools that we develop.

We've prioritized the most important types of content that we can build A.I. tools for today, like terror-related content, where, as I mentioned earlier, the systems that we deploy flag 99 percent of the ISIS and Al Qaida-related content that we take down before a person even reports it to us.

If we fast forward 5 or 10 years, I think we're going to have more A.I. technology that can do that in more areas. And I think we need to get there as soon as possible, which is why we're investing in it.

GRASSLEY: Senator Sasse.

COONS: I couldn't agree more. I just think we can't wait five years ...

GRASSLEY: Senator ...

COONS: ... to get housing discrimination and personally offensive material out of Facebook.

ZUCKERBERG: I agree.

GRASSLEY: Senator Sasse?

SASSE: Thank you, Mr. Chairman.

Mr. Zuckerberg, thanks for being here. At current pace, you're due to be done with the first round of questioning by about 1:00 a.m., so congratulations.

I — I like Chris Coons a lot, with his own family, or with Dan Sullivan's family. Both are great photos. But I want to ask a similar set of questions from the other side, maybe.

I think the line — the conceptual line — between a mere tech company, mere tools, and an actual content company is really hard to draw. I think you guys have a hard challenge. I think regulation over time will have a hard challenge. And you're a private company, so you can make policies that may be less than First Amendment full-spirit-embracing, in my view. But I worry about that. I worry about a world where you can go from violent groups to hate speech in a hurry, and, as in one of your responses to the opening questions, you may decide, or Facebook may decide, it needs to police a whole bunch of speech that I think America might be better off not having policed by one company that has a really big and powerful platform.

Can you define hate speech?

ZUCKERBERG: Senator, I think that this is a really hard question. And I think it's one of the reasons why we struggle with it. There are certain definitions that — that we — that we have around, you know, calling for violence or ...

SASSE: Let's just agree on that.

ZUCKERBERG: Yes.

SASSE: If somebody's calling for violence, we — that shouldn't be there. I'm worried about the psychological categories around speech. You used language of safety and protection earlier. We see this happening on college campuses all across the country. It's dangerous. Forty percent of Americans under age 35 tell pollsters they think the First Amendment is dangerous because you might use your freedom to say something that hurts somebody else's feelings.

Guess what? There are some really passionately held views about the abortion issue on this panel today. Can you imagine a world where you might decide that pro-lifers are prohibited from speaking about their abortion views on your content — on your platform?

ZUCKERBERG: I certainly would not want that to be the case.

SASSE: But it might really be unsettling to people who've had an abortion to have an open debate about that, wouldn't it?

ZUCKERBERG: It might be, but I don't think that that would — would fit any of the definitions of — of what we have. But I do generally agree with the point that you're making, which is as we — as we're able to technologically shift towards especially having A.I. proactively look at content, I think that that's going to create massive questions for society about what obligations we want to require companies to — to fulfill. And I do think that that's a question that we need to struggle with as a country, because I know other countries are, and they're putting laws in place. And I think that America needs to figure out and create the set of principles that we want American companies to operate under.

SASSE: Thanks. I wouldn't want you to leave here today and think there's sort of a unified view in the Congress that you should be moving toward policing more and more and more speech. I think violence has no place on your platform. Sex traffickers and human traffickers have no place on your platform. But vigorous debates? Adults need to engage in vigorous debates.

I have only a little less than two minutes left, so I'm going to shift gears a little bit. But that was about adults. You're a dad. I'd like to talk a little bit about social media addiction. You started your comments today by talking about how Facebook is and was founded as an optimistic company. You and I have had conversations separate from here. I don't want to put words in your mouth, but I think as you've aged you might be a little bit less idealistic and optimistic than you were when you — when you started Facebook.

As a dad, do you worry about social media addiction as a problem for America's teens?

ZUCKERBERG: Well my hope is — is that we can be idealistic but have a broad view of our responsibility.

To your — your point about teens, this is certainly something that I think any parent thinks about, is how much do you want your kids using technology. It — at Facebook, specifically, I view our responsibility as not just building services that people like, but building services that are good for people and good for society as well.

So we study a lot of the effects on well-being of our — of our tools and broader technology. And you know, like any tool, there are good and — and bad uses of it.

What we find in general is that if you're using social media in order to build relationships, right? So you're — you're sharing content with friends, you're interacting, then that is associated with all of the long-term measures of well-being that you'd intuitively think of.

Long-term health, long-term happiness, long-term feeling connected, feeling less lonely. But if you're using the Internet and social media primarily to just passively consume content, and you're not engaging with other people, then it doesn't have those positive effects and it could be negative.

SASSE: We're — we're almost at time, so I want to — I want to ask you one more. Do social media companies hire consulting firms to help them figure out how to get more dopamine feedback loops so that people don't want to leave the platform?

ZUCKERBERG: No, Senator. That's not how we talk about this, or — or how we set up our product teams. We want our products to be valuable to people. And if they're valuable, then people choose to use them.

SASSE: Are you aware of other social media companies that do hire such consultants?

ZUCKERBERG: Not sitting here today.

SASSE: Thanks.

GRASSLEY: Senator Markey?

MARKEY: Thank you, Mr. Chairman.

In response to Senator Blumenthal's pointed questions, you refused to answer whether Facebook should be required by law to obtain clear permission from users before selling or sharing their personal information.

So I'm going to ask it one more time. Yes or no. Should Facebook get clear permission from users before selling or sharing sensitive information about your health, your finances, your relationships? Should you have to get their permission?

That's, essentially, the consent decree with the Federal Trade Commission that you signed in 2011. Should you have to get permission? Should the consumer have to opt in?

ZUCKERBERG: Senator, we do require permission to use the — the system, and to — to put information in there, and for — for all the uses of it.

I want to be clear. We don't sell information. So regardless of whether we could get permission to do that, that's just not a thing that we're going to go do.

MARKEY: So would you support legislation? I have a bill, Senator Blumenthal referred to it, the CONSENT Act, that would just put on the books a law saying that Facebook, and any other company that gathers information about Americans, has to get their permission, their affirmative permission, before it can be reused for other purposes.

Would you support that legislation to make it a national standard for not just Facebook, but for all the other companies out there? Some of them, bad actors. Would you support that legislation?

ZUCKERBERG: Senator, I — I — in general, I think that that principle is exactly right. And I think we should have a — a discussion around how to best apply that.

MARKEY: No, would you support legislation to back that general principle, that opt-in, that getting permission is the standard. Would you support legislation to make that the American standard?

Europeans have passed that as a law. Facebook's going to live with that law beginning on May 25th. Would you support that as the law in the United States?

ZUCKERBERG: Senator, as a principle, yes, I would. I think the details matter a lot, and now that ...

MARKEY: Right. But assuming that we work out the details, you do support opt-in as the standard? Getting permission affirmatively as the standard for the United States? Is that correct?

ZUCKERBERG: Senator, I think that that's the right principle. And a hundred billion times a day in our services, when people go to share content, they choose who they want to share it with affirmatively.

MARKEY: So you — you — you could support a law that enshrines that as the promise that we make to the American people, that permission has to be obtained before their information is used. Is that correct?

ZUCKERBERG: Senator, yes. I said that in principle I think that that makes sense, and the details matter and I look forward to having our team work with you on fleshing that out.

MARKEY: Right. So, the next subject, because I want to make sure, again, that we kind of drill down here. You earlier made reference to the Children's Online Privacy Protection Act of 1998, which I am the author of. So that is the constitution for child privacy protection online in this country, and I'm very proud of that. But there are no additional protections for a 13-, a 14-, or a 15-year-old. They get the same protections that a 30-year-old or a 50-year-old gets.

So I have a separate piece of legislation to ensure that kids who are under 16 absolutely have a privacy bill of rights, and that permission has to be received from their parents before any of their children's information is reused for any purpose other than that which was originally intended. Would you support a child online privacy bill of rights for kids under 16 to guarantee that that information is not reused for any other purpose without explicit permission from the parents for the kids?

ZUCKERBERG: Senator, I think, as a general principle, I think protecting minors and protecting their privacy is extremely important, and we do a number of things on Facebook to do that already, which I am happy to ...

MARKEY: I appreciate that. I'm talking about a law. I'm talking about a law. Would you support a law to ensure that kids under 16 have this privacy bill of rights? I had this conversation with you in your office seven years ago, on this specific subject, in Palo Alto. And I think that's really what the American people want to know right now: What are the protections here? What are the protections that are going to be put on the books for their families, but especially for their children? Would you support a privacy bill of rights for kids where opt-in is the standard? Yes or no?

ZUCKERBERG: Senator, I think that that's an important principle and ...

MARKEY: I appreciate that.

ZUCKERBERG: ... and I think we should ...

MARKEY: But we need a law to protect those children. That's my question to you. Do you think we need a law to do so? Yes or no?

ZUCKERBERG: Senator, I'm not sure if we need a law, but I think that this is certainly a thing that deserves a lot of discussion.

MARKEY: And again, I couldn't disagree with you more. We're leaving these children to the most rapacious commercial predators in the country, who will exploit these children unless we absolutely have a law on the books. And I think it's ...

GRASSLEY: Please give a short — please give a short answer.

ZUCKERBERG: Senator, I look forward to having my team follow up with you to flesh out the details of it.

GRASSLEY: Senator Flake? Senator Flake?

(CROSSTALK)

MARKEY: ... issued to get a correct answer to that.

FLAKE: Thank you, Mr. Chairman.

Thank you, Mr. Zuckerberg. Thanks for enduring so far, and I'm sorry if I plow old ground; I had to be away for a bit.

I, myself, and Senator Coons, Senator Peters, and a few others were in the country of Zimbabwe just a few days ago. We met with opposition figures who talked about, you know, their goal to be able to have access to state-run media.

FLAKE: In many African countries, many countries around the world, third-world countries, small countries, the only traditional media is state run, and we ask them how they get their message out, and it's through social media. Facebook provides a very valuable service in many countries for opposition leaders or others who simply don't have access, unless maybe just before an election, to traditional media. So that's very valuable, and I think we all recognize that.

On the flip side, we've seen with the Rohingya that example of, you know, where the state could use similar data, or use this platform, to go after people. You talked about what you're doing in that regard, hiring more, you know, traditional — or, local-language speakers. What else are you doing in that regard to ensure that these states, or these governments, don't go after opposition figures or others?

ZUCKERBERG: Senator, there are three main things that we're doing, in Myanmar specifically, and that will apply to — to other situations like that. The first is hiring enough people to do local language support, because the definition of hate speech or things that can be racially coded to incite violence are very language-specific and we can't do that with just English speakers for people around the world. So we need to grow that.

The second is, in these countries there tend to be active civil society, who can help us identify the figures who are — who are spreading hate. And we can work with them in order to make sure that those figures don't have a place on our platform.

The third is that there are specific product changes that we can make in order to — that — that might be necessary in some countries but not others, including things around news literacy — right.

And, like, encouraging people in — in different countries, you know, ramping up or down things that we might do around fact-checking of content; specific product-type things that we would implement in different places. But I think that that's something that we're going to have to do in a number of countries.

FLAKE: There are obviously limits to, you know, the native speakers that you can hire or the people that can have eyes on the page. Artificial intelligence is going to have to take the bulk of this. How — how much are you investing in working on — on that tool to — to do what, really, we don't have, or can't hire, enough people to do?

ZUCKERBERG: Senator, I think you're absolutely right that over the long term, building A.I. tools is going to be the scalable way to identify and root out most of this harmful content. We're investing a lot in doing that, as well as scaling up the number of people who are doing content review.

One of the things that I've mentioned is this year we're — or, in the last year, we've basically doubled the number of people doing security and content review. We're going to have more than 20,000 people working on security and content review by the end of this year. So it's going to be coupling continuing to grow the people who are doing review in these places with building A.I. tools, which is — we're — we're working as quickly as we can on that, but some of this stuff is just hard. That, I think, is going to help us get to a better place on eliminating more of this harmful content.

FLAKE: Thank you. You've talked some about this, I know, do you believe that Russian and/or Chinese governments have harvested Facebook data and have detailed data sets on Facebook users? Has your forensic analysis shown you who else, other than Cambridge Analytica, downloaded this kind of data?

ZUCKERBERG: Senator, we have kicked off an investigation of every app that had access to a large amount of people's data before we locked down the platform in 2014. That's underway; I imagine we'll find some things, and we are committed to telling the people who were affected when we do. I don't think, sitting here today, that we have specific knowledge of — of other efforts by — by those nation-states. But, in general, we assume that a number of countries are trying to abuse our systems.

FLAKE: Thank you.

Thank you, Mr. Chairman.

GRASSLEY: (Inaudible) person is Senator Hirono.

HIRONO: Thank you, Mr. Chairman.

Mr. Zuckerberg, the U.S. Immigration and Customs Enforcement has proposed a new extreme vetting initiative, which they have renamed Visa Lifecycle Vetting; that sounds less scary.

They have already held an industry day, which they advertised on the federal contracting website, to get input from tech companies on the best way to, among other things, and I'm quoting ICE, “exploit publicly available information, such as media, blogs, public hearings, conferences, academic websites, social media websites such as Facebook, Twitter and LinkedIn to extract pertinent information regarding targets.”

And basically what they — what they want to do with these targets is to determine, and again, I'm quoting ICE's own document, they want — ICE has been directed to develop processes that determine and evaluate an applicant's, i.e., a target's, probability of becoming a positively contributing member of society, as well as their ability to contribute to the national interest, in order to meet the executive order. That is the president's executive order.

And then ICE must also develop a mechanism or methodology that allows them to assess whether an applicant intends to commit criminal or terrorist acts after entering the United States. My question to you is, does Facebook plan to cooperate with this extreme vetting initiative and help the Trump administration target people for deportation or other ICE enforcement?

ZUCKERBERG: Senator, I don't know that we've had specific conversations around that. In general ...

HIRONO: If you were asked to provide information to or cooperate with ICE so that they could determine whether somebody is going to commit a crime, for example, or become a fruitful member of our society, would you cooperate?

ZUCKERBERG: We would not proactively do that. We cooperate with law enforcement in two cases. One is if we become aware of an imminent threat of harm, then we will proactively reach out to law enforcement, as we believe is our responsibility to do.

The other is when law enforcement reaches out to us with a valid legal subpoena or — or request for data. In those cases, if their request is overly broad or we believe it's not a legal request, then we're going to push back aggressively.

HIRONO: Well, let's assume that ICE doesn't have a — a — that there's no law or rule that requires that Facebook cooperate to allow them to get this kind of information so that they can make those kinds of assessments. It sounds to me as though you would decline?

ZUCKERBERG: Senator, that is correct.

HIRONO: Is there some way that — well, I know that you determine what kind of content would be deemed harmful, so do you believe that ICE can even do what they are talking about?

Namely, through a combination of various kinds of information, including information that they would hope to obtain from entities such as yours, predict who would commit crimes or present a national security problem. Do you think that — that that's even doable?

ZUCKERBERG: Senator, I'm not familiar enough with what they're doing to offer an informed opinion on that.

HIRONO: Well, you have to make assessments as to what constitutes hate speech. That's pretty hard to do. You have to assess what election interference is. So these are rather difficult to identify. But wouldn't the — trying to predict whether somebody's going to commit a crime fit into the category of pretty difficult to assess?

ZUCKERBERG: Senator, it sounds difficult to me. All of these things, like you're saying, are difficult. I don't know without having worked on it or thinking about it ...

(CROSSTALK)

HIRONO: I think common sense would tell us that that's pretty difficult. And yet, that's what ICE is proceeding to do. You were asked about discriminatory advertising, and in February of 2017, Facebook announced that it would no longer allow certain kinds of ads that discriminated on the basis of race, gender, family status, sexual orientation, disability, or veteran status, all categories prohibited by federal law in housing. And yet, after 2017, it was discovered that you could in fact still place those kinds of ads.

So what is the status of whether or not these ads can currently be placed on Facebook? And have you followed through on your February 2017 promise to address this problem? And is there a way for the public to verify that you have, or are — are we just expected to trust that you've done this?

ZUCKERBERG: Senator, those — those are all important questions, and in general it is against our policies to — to have any ads that are discriminatory. Some of ...
HIRONO: Well, you said that you wouldn't allow it, but then — was it ProPublica — could place these ads even after you said you would no longer allow these kinds of ads. So what assurance do we have from you that this is stop — going to stop?

ZUCKERBERG: Well, two things. One is that we've removed the ability to exclude ethnic groups and other sensitive categories from ad targeting. So that just isn't a feature that's even available anymore. For some of these cases, where it may make sense to target a group proactively, the enforcement today is — is still — we review ads, we screen them up front, but most of the enforcement today is still that our community flags issues for us when they come up.

So if the community flags that issue for us, then our team, which has thousands of people working on it, should take it down. We'll make some mistakes, but we try to make as few as possible. Over time, I think the strategy would be to develop more A.I. tools that can more proactively identify those types of content and do that filtering up front.

(CROSSTALK)

HIRONO: So it's a work in progress.

ZUCKERBERG: Yes.

THUNE: Thank you. Thank you, Senator Hirono. Senator Sullivan's up next.

HIRONO: Thank you.

SULLIVAN: Thank you, Mr. Chairman. And Mr. Zuckerberg, quite a story, right? Dorm room to the global behemoth that you guys are. Only in America, would you agree with that?

ZUCKERBERG: Senator, mostly in America.

SULLIVAN: You couldn't — you couldn't do this in China, right? Or do what you did in 10 years.

ZUCKERBERG: Well — well, senator, there are — there are some very strong Chinese Internet companies.

SULLIVAN: Right but — you're supposed to answer “yes” to this question.

(LAUGHTER)

Okay, come on, I'm trying to help you, right?

(CROSSTALK)

THUNE: This is — this is the softball.

SULLIVAN: I mean, give me a break. You're in front of a bunch of — the answer is “yes,” okay, so thank you.

(LAUGHTER)

Now, your — your testimony — you have talked about a lot of power — you've been involved in elections. I thought your — your testimony was very interesting. All — really all over the world, the Facebook — 2 billion users, over 200 million Americans, $40 billion in revenue. I believe you and Google have almost 75 percent of the digital advertising market in the U.S.

Is — one of the key issues here, is Facebook too powerful? Are you too powerful? And do you think you're too powerful?

ZUCKERBERG: Well, senator, I think most of the time when people talk about our scale, they're referencing that we have 2 billion people in our community. And I think one of the big questions that we need to think through here is that the vast majority of those 2 billion people are outside of the U.S. And I think that that's something that, to your point, Americans should be proud of.

(CROSSTALK)

ZUCKERBERG: And when I brought up the Chinese Internet companies, I think that that's a real — a real strategic and competitive threat that, in American technology policy we (inaudible) should be thinking about.

(CROSSTALK)

SULLIVAN: Let me ask you another point here real quick.

I — I want to — I — I don't want to interrupt, but you know, when you look at kind of the history of this country and you look at the history of these kinds of hearings, right? You're a smart guy. You read a lot of history. When companies become big and powerful and accumulate a lot of wealth and power, what typically happens from this body is there's an — there is an instinct to either regulate or break up, right?

Look at the history of this nation. You have any thoughts on those two policy approaches?

ZUCKERBERG: Well, senator, I'm not the type of person that thinks that all regulation is bad. So I think the Internet is becoming increasingly important in people's lives, and I think we need to have a full conversation about what is the right regulation, not whether it should be or shouldn't be.

SULLIVAN: Let me — let me talk about the tension there, because I — I think it's a good point and I appreciate you mentioning that. You know, my — one of my worries on regulation, again, with a company of your size: you're saying, hey, we might be interested in being regulated. But as you know, regulations can also cement the dominant power. So what do I mean by that? You know, you have a lot of lobbyists — I think every lobbyist in town is involved in this hearing in some way or another — a lot of powerful interests. You look at what happened with Dodd-Frank. That was supposed to be aimed at the big banks. The regulations ended up empowering the big banks and keeping the small banks down.

Do you think that that's a risk, given your influence, that if we regulate, we're actually going to regulate you into a position of cemented authority? One of my biggest concerns about what you guys are doing is that the next Facebook — which we all want, the guy in the dorm room; we all want that to start — that you are becoming so dominant that we're not able to have that next Facebook. What — what — what are your views on that?

ZUCKERBERG: Well, senator, I agree with the point that when you're thinking through regulation, across all industries, you need to be careful that it doesn't cement in the current companies that are — that are winning.

SULLIVAN: But would you try to do that? Isn't that the normal inclination of a company, to say, hey, I'm going to hire the best guys in town and I'm going to cement in an advantage. You wouldn't do that if we were regulating you.

ZUCKERBERG: Senator, that — that certainly wouldn't be our approach. But — but I think — I think part of the challenge with regulation in general is that when you add more rules that companies need to follow, that's something that a larger company like ours inherently just has the resources to go do, and that might just be harder for a smaller company getting started to be able to comply with.

SULLIVAN: Correct.

ZUCKERBERG: So it's not something that — like going into this, I would look at the conversation as what is the right outcome. I think there are real challenges that we face around content and privacy and in a number of areas, ads transparency, elections ...

SULLIVAN: Let me — let me get — I'm sorry to interrupt, but let me get to one final question. It kind of relates to what you're talking about in terms of content regulation and what exactly — what exactly Facebook is.

You know, you — you mention you're a tech company, a platform, but there are some who are saying that you're the world's biggest publisher. I think about 140 million Americans get their news from Facebook, and when you talked to — when you answered Senator Cornyn, you said you are responsible for your content.

So which are you? Are you a tech company, or are you the world's largest publisher? Because I think that goes to a really important question of what form of regulation or government action, if any, we would take.

ZUCKERBERG: Senator, this is a — a really big question. I — I view us as a tech company because the primary thing that we do is build technology and products.

SULLIVAN: But you said you're responsible for your content, which makes ...

ZUCKERBERG: Exactly.

SULLIVAN: ... you kind of a publisher, right?

ZUCKERBERG: Well, I agree that we're responsible for the content, but we don't produce the content. I — I think that when people ask us if we're a media company or a publisher, my understanding of what — the heart of what they're really getting at, is do we feel responsibility for the content on our platform.

The answer to that, I think, is clearly “yes.” And — but I don't think that that's incompatible with fundamentally, at our core, being a technology company where the main thing that we do is have engineers and build products.

THUNE: Thank you, Senator Sullivan.

Senator Udall?

UDALL: Thank you, Mr. Chairman. And thank you very much, Mr. Zuckerberg, for being here today. You — you spoke very idealistically about your company, and you talked about the strong values, and you said you wanted to be a positive force in the community and the world.

And you were hijacked by Cambridge Analytica for political purposes. Are you angry about that?

ZUCKERBERG: Absolutely.

UDALL: And — and you're determined — and I assume you want changes made in the law? That's what you've talked about today.

ZUCKERBERG: Senator, the most important thing that I care about right now is making sure that no one interferes in the various 2018 elections around the world.

We have an extremely important U.S. midterm. We have major elections in India, Brazil, Mexico, Pakistan, Hungary coming up. And we're going to take a — a number of measures, from building and deploying new A.I. tools that take down fake news, to growing our security team to more than 20,000 people, to making it so that we verify every advertiser who's doing political and issue ads, to make sure that that kind of interference that the Russians were able to do in 2016 is going to be much harder for anyone to pull off in the future.

UDALL: And — and I think you've said earlier that you support the Honest Ads Act, and so I assume that means you want changes in the law in order to — to effectuate exactly what you talked about?

ZUCKERBERG: Senator, yes.

UDALL: Yeah, yeah.

ZUCKERBERG: We support the Honest Ads Act. We're implementing it.

UDALL: And so are you going to — are you going to come back up here and be a strong advocate, to see that that law is passed?

ZUCKERBERG: Senator, the biggest thing that I think we can do is implement it. And we're doing that.

UDALL: That's a kind of yes-or-no question, there. I hate to interrupt you, but are you going to come back and be a strong advocate? You're angry about this. You think there ought to be change. There ought to be a law put in place. Are you going to come back and be an advocate, to get a law in place like that?

ZUCKERBERG: Senator, our team is certainly going to work on this. What I can say is, the biggest thing that ...

(CROSSTALK)

UDALL: I'm talking about you, not your team.

ZUCKERBERG: Well, Senator, I try ...

(CROSSTALK)

UDALL: (inaudible) come back here and be ...

ZUCKERBERG: ... not to come to D.C.

UDALL: ... an advocate for that law? That's what I want to see. I mean, you're upset about this. We're upset about this. I — I'd like a yes-or-no answer on that one.

ZUCKERBERG: Senator, I'm — I'm posting and speaking out publicly about how important this is. I don't come to Washington, D.C., too often. I'm going to direct my team to focus on this. And the biggest thing that I feel like we can do is implement it, which we're doing.

UDALL: Well, the biggest thing you can do is to be a strong advocate yourself, personally, here in Washington. Just let me make that clear. But many of us have seen the kinds of images shown earlier by Senator Leahy. You saw those images that he held up.

Can you guarantee that any of those images that can be attributed or associated with the Russian company, Internet Research Agency, have been purged from your platform?

ZUCKERBERG: Senator, no, I can't guarantee that, because this is an ongoing arms race. As long as there are people sitting in Russia whose job it is to try to interfere with elections around the world, this is going to be an ongoing conflict.

What I can commit to is that we're going to invest significantly, because this is a top priority, to make sure that people aren't spreading misinformation or trying to interfere in elections on Facebook.

But I don't think it would be a realistic expectation to assume that, as long as there are people who are employed in Russia for whom this is their job, we're going to have zero amount of that, or that we're going to be 100 percent successful at preventing it.

UDALL: Now, beyond disclosure of online ads, what specific steps are you taking to ensure that foreign money is not financing political or issue ads on Facebook in violation of U.S. law? Just because someone submits a disclosure that says paid for by some 501(c)(3) or PAC, if that group has no real person in the U.S., how can we ensure it is not foreign — foreign interference?

ZUCKERBERG: Senator, our verification program involves two pieces. One is verifying the identity of the person who's buying the ads, that they have a valid government identity. The second is verifying their location. So if you're sitting in Russia, for example, and you say that you're in the U.S., then we'll be able to — to make it a lot harder to do that, because what we're actually going to do is mail a code to the address that you say you're at.

And if you can't get access to that code, then you're not going to be able to run ads.

UDALL: Yes. Now, Facebook is creating an independent group to study the abuse of social media in elections. You've talked about that. Will you commit that all findings of this group are made public, no matter what they say about Facebook or its business model? Yes-or-no answer.

ZUCKERBERG: Senator, that's the purpose of this group, is that Facebook does not get to control what these folks publish. These are going to be independent academics, and Facebook has no prior publishing control. They'll be able to do the studies that — that — that they're doing and publish the results.

UDALL: And you're fine with them being public? And what's the timing on getting those out?

ZUCKERBERG: Senator, we're — we're kicking off the research now. Our goal is to focus on both providing ideas for preventing interference in 2018 and beyond, and also for holding us accountable to making sure that the measures that we put in place are successful in doing that. So I would hope that we will start to see the first results later this year.

UDALL: Thank you, Mr. Chairman.

THUNE: Thank you, Senator Udall.

Senator Moran is up next, and I would just say again, for the benefit of those who are here, that after a couple of more questions, we'll probably give the witness another short break.

ZUCKERBERG: Thank you.

THUNE: So we're — we're getting about almost two thirds through the — the list of members who are here to ask questions.

Senator Moran.

MORAN: Mr. Chairman, thank you. Mr. Zuckerberg, thank you for your — I'm over here. Thank you for your testimony and thank you for your presence here today. On March the 26th of this year, the FTC confirmed that it was investigating Facebook to determine whether its privacy practices violated the FTC Act or the consent order that Facebook entered into with the agency in 2011.

I chair the Commerce Committee subcommittee that has jurisdiction over the Federal Trade Commission. I remain interested in Facebook's assertion that it rejects any suggestion of violating that consent order. Part two of that consent order requires that Facebook, quote, “clearly and prominently” display notice and obtain users' affirmative consent before sharing their information with, quote, “any third party.”

My question is, how does the case of approximately 87 million Facebook friends having their data shared with a third party due to the consent of only 300,000 users not violate that agreement?

ZUCKERBERG: Well, Senator, like I said earlier, I mean, our view is that — is that we believe that we are in compliance with the consent order, but I think we have a broader responsibility to protect people's privacy even beyond that. And in this specific case, the way that the platform worked, that you could sign into an app and bring some of your information and some of your friends' information, is how we explained it would work. People had settings to that effect. We explained it and — and they consented to — to it working that way. And the — the system basically worked as it was designed.

The issue is that we designed the system in a way that wasn't good. And now we — starting in 2014 — have changed the design of the system so that it just massively restricts the amount of — of data access that a developer can get.

(CROSSTALK)

MORAN: The — I'm sorry, the 300,000 people, they were treated in a way that — it was appropriate; they consented. But you're not suggesting that the friends consented?

ZUCKERBERG: Senator, I believe that — that we rolled out this developer platform, and that we explained to people how it worked, and that they did consent to it. It — it makes sense, I think, to — to go through the way the platform works. I mean, it's — in 2007, we — we announced the Facebook developer platform, and the idea was that you wanted to make more experiences social, right?

So, for example, if you — like, you might want to have a calendar that can have your friends' birthdays on it, or you might want your address book to have your friends' pictures in it, or you might want a map that can show your friends' addresses on it. In order to do that, we needed to build a tool that allowed people to sign in to an app and bring some of their information, and some of their friends' information, to those apps. We made it very clear that this is how it worked, and — and when people signed up for Facebook, they signed up for that as well.

Now, a lot of good use cases came from that. I mean, there were games that were built. There were integrations with companies that, I think, we're familiar with, like Netflix and Spotify. But over time, what became clear was that that also enabled some abuse. And that's why in 2014, we took the step of changing the platform. So now, when people sign in to an app, you do not bring some of your friends' information with you. You're only bringing your own information and you're able to connect with friends who have also authorized that app directly.

MORAN: Let me turn to the bug — your Bug Bounty program. Our subcommittee has had hearings in — a hearing in regard to Bug Bounty. Your press release indicated that one of the six changes Facebook initially offered to crack down on platform abuses was to reward outside parties who find vulnerabilities.

One concern I have regarding the utility of this approach is that vulnerability disclosure programs are normally geared toward identifying unauthorized access to data, not pointing out data-sharing arrangements that could harm someone but technically abide by complex consent agreements. How do you see the Bug Bounty program that you've announced addressing that issue?

ZUCKERBERG: Sorry, could you — could you clarify what — what specifically ...

MORAN: How do you — how do you see that the Bug Bounty program that you are — have announced will deal with the sharing of information not permissible, as compared to just unauthorized access to data?

ZUCKERBERG: Senator, I'm not — I'm not too sure I — I understand this enough to — to speak to — to that specific point, and I can have my team follow up with you on the details of that.

In general, bounty programs are an important part of the security arsenal for hardening a lot of systems. I — I think we should expect that we're going to invest a lot in hardening our systems ourselves, and that we're going to audit and investigate a lot of the folks in our ecosystem.

But even with that, having the ability to enlist other third parties outside of the company to be able to help us out, by giving them an incentive to point out when they see issues, I think is likely going to help us improve the security of the platform overall, which is why we did this.

MORAN: Thank you, Mr. Zuckerberg.

THUNE: Thank you, Senator Moran.

Next up is Senator Booker.

BOOKER: Thank you, Mr. Chairman.

Hello, Mr. Zuckerberg. As you know, much of my life has been focused on low-income communities, poor communities, working-class communities, and trying to make sure they have a fair shake. This country has a very bad history of discriminatory practices towards low-income Americans and Americans of color, from the FHA's redlining practices to, even more recently, really just discriminatory practices in the mortgage business. I've always seen technology as a promise to democratize our nation, expand access, expand opportunities.

But unfortunately, we've also seen how platforms, technology platforms like Facebook, can actually be used to double down on discrimination and — and give people more sophisticated tools with which to discriminate.

Now in — in 2016, ProPublica revealed that advertisers could use ethnic affinity, a user's race, as a marketing category to potentially discriminate against Facebook users in the areas of housing, employment and credit, echoing a dark history in this country, and — and also in violation of federal law.

In 2016, Facebook committed to fixing this — to fixing the access that advertisers had to this data. But unfortunately, a year later, as — as — as ProPublica's article showed, they found that the system Facebook built was still allowing housing ads to go forward without applying these new restrictions that were put on.

Facebook then opted for a system that's very similar to what we've been talking about with Cambridge Analytica: advertisers could self-certify that they were not engaging in these practices and were complying with federal law, using this self-certification as the way to comply with Facebook's anti-discrimination policy.

Unfortunately, a recent lawsuit, filed in February 2018, alleges that discriminatory ads were still being created on Facebook, still disproportionately impacting low-income communities and communities of color.

Given the fact that you allowed Cambridge Analytica to self-certify in a way that I think — at least I think — you've expressed regret over, is self-certification the best and strongest way to safeguard — guard against the misuse of your platform and protect the data of users, and not let it be manipulated in such a discriminatory fashion?

ZUCKERBERG: Senator, this is a — a — a very important question and, in general, I think over time we're going to move towards more proactive review, with more A.I. tools to help flag problematic content.

In the near term, we have a lot of content on the platform, and we — it's — it's hard to review every single thing up front. We do a quick screen. But I — I agree with you that I think in — in this specific case, I'm not happy with where we are, and I — I think it makes sense to — to really focus on making sure that these areas get more reviews sooner.

BOOKER: And I — and I know you understand that there is a growing distrust and I know a lot of civil rights organizations have met with you about Facebook's sense of urgency to address these issues.

There's a distrust that stems from the fact, and I know — I've had conversations with leaders at Facebook about the lack of diversity in the tech sector as well: the people who are writing these algorithms, the people who are actually policing for this data, or policing for these problems — are they going to be part of a more diverse group that's looking at this? You're looking to fill, as you said, 5,000 new positions for, among other things, reviewing content. But we know in your industry that inclusivity — it's a real serious problem; you are an industry that lacks diversity in a very dramatic fashion. It's not just true with Facebook; it's true of the tech sector as well. And — and so it's very important for me to — to communicate that larger sense of urgency, and — and what a lot of civil rights organizations are concerned with, and — and that we should be working towards a more collaborative approach.

BOOKER: And I'm wondering if you'd be open to opening your platform for civil rights organizations to really audit a lot of these companies dealing in areas of credit and housing, to really audit what is actually happening, and to have more transparency in working with your platform.

ZUCKERBERG: Senator, I think that's a very good idea. And I think we should follow up on the details of that.

BOOKER: I also want to say that — that there was an investigation. Something that's very disturbing to me is the fact that there have been law enforcement organizations that use Facebook's platform to — to — to surveil African American organizations like Black Lives Matter.

I know you've expressed support for the group, and Philando Castile's killing was broadcast live on Facebook. But there are a lot of communities of color worried that that data can be used to surveil groups like Black Lives Matter, like folks who are trying to organize against substantive issues of discrimination in this country.

Is this something that you're committed to addressing, and to ensuring that civil rights activists and others are not targeted, that their work is not being undermined, and that people are not using your platform to unfairly surveil and try to undermine the activities that those groups are doing?

ZUCKERBERG: Yes, Senator. I think that that's very important. We're — we're committed to that.

And in general, unless law enforcement has a very clear subpoena or ability or — or reason to get access to information, we're going to push back on that across the board.

BOOKER: And then I'd just like, for the record — my time has expired ...

GRASSLEY: Yeah.

BOOKER: ... but there's a lawsuit against Facebook about discrimination. And you moved for the lawsuit to be dismissed because no harm was shown. Could you please submit to the record — do you believe that people of color who were not recruited for various economic opportunities are being harmed? Can you please clarify, for the record, why you moved to dismiss that lawsuit?

GRASSLEY: For the record.

Senator Heller's up next.

I'll go to you.

HELLER: All right, Mr. Chairman. Thank you.

Appreciate the time, and thank you for being here. I'm over here. Thanks. And thank you for taking time. I know it's been a long day, and I think you're at the — at the final stretch, here. But I'm glad that you are here.

Yesterday Facebook sent out a notification to 87 million users that their information was given to Cambridge Analytica without their consent. My daughter was one of the 87 million, and six of my staff, all from Nevada, received this notification.

Can you tell me how many Nevadans were among the 87 million that received this notification?

ZUCKERBERG: Senator, I don't have this broken out by state right now. But I can have my team follow up with you to get you the information.

HELLER: Okay, okay. I figured that would be the answer. If, after hearing this — going through this hearing and Nevadans no longer want to have a Facebook account, if — if that's the case, if a Facebook user deletes their account, do you delete their data?

ZUCKERBERG: Yes.

HELLER: My kids have been on Facebook and Instagram for years. How long do you keep a user's data?

ZUCKERBERG: Sorry, can ...

HELLER: How long do you keep a user's data, once they — after — after they've left? If they — if they choose to delete their account, how long do you keep their data?

ZUCKERBERG: I don't know the answer to that off the top of my head. I know we try to delete it as quickly as is reasonable. We have a lot of complex systems, and it — it takes a while to work through all that.

But I think we try to move as quickly as possible, and I can follow up or have my team follow up ...

HELLER: Yeah.

ZUCKERBERG: ... to get you the — the data on that.

HELLER: Okay. Have you ever said that you won't sell an ad based on personal information? Simply that — that you wouldn't sell this data because the usage of it goes too far?

ZUCKERBERG: Senator, could you clarify that?

HELLER: Have you ever drawn the line on selling data to an advertiser?

ZUCKERBERG: Yes, senator. We don't sell data at all.

So the — the way the ad system works is that advertisers can come to us and say, I — I have a message that I'm trying to reach a certain type of people with. They might be interested in something, they might live in a place, and then we help them get that message in front of people. But it's — it's widely mischaracterized about our system that we sell data. Actually, one of the most important parts of how Facebook works is that we do not sell data. Advertisers do not get access to people's individual data.

HELLER: Have you ever collected the content of phone calls or messages through any Facebook application or service?

ZUCKERBERG: Senator, I don't believe we have ever collected the content of — of phone calls. We have an app called Messenger that allows people to message most of their Facebook friends. And we do on — in the Android operating system allow people to use that app as their client for both Facebook messages and texts. So we do allow people to import their texts into that.

HELLER: Okay. Let me ask you about government surveillance. For years Facebook said that there'd be — that there should be strict limits on the information the government can access on Americans. And by the way, I agreed with you on that — because privacy is important to Nevadans. You argue that Facebook users wouldn't trust you if they thought you were giving their private information to the intelligence community. Yet you use and sell the same data to make money. And in the case of Cambridge Analytica, you don't even know how it's used after you sell it. Can you tell us why this isn't hypocritical?

ZUCKERBERG: Well senator, once again, we don't sell any data to anyone. We don't sell it to advertisers, and we don't sell it to developers. What we do allow is for people to sign in to apps and bring their data with them. It used to include the data of some of their friends, but now it doesn't. And that, I think, makes sense. I mean, that's basic data portability. The idea that you own the data means you should be able to take it from one app to another if you'd like.

HELLER: Do you believe you're more responsible with millions of Americans' personal data than the federal government would be?

ZUCKERBERG: Yes. But, senator, the — your point about surveillance, I think that there's a very important distinction to draw here, which is that when — when organizations do surveillance people don't have control over that. But on Facebook, everything that you share there you have control over. You can — you can say I don't want this information to be there. You have full access to understand all, every piece of information that Facebook might know about you, and you can get rid of all of it. And I — I don't know of any other — any surveillance organization in the world that operates that way, which is why I think that that comparison isn't really apt here.

HELLER: With you here today, do you think you're a victim?

ZUCKERBERG: No.

HELLER: Do you think Facebook as a company is a victim?

ZUCKERBERG: Senator, no. I think that we have a responsibility to protect everyone in our community from anyone in — in our ecosystem who is going to potentially harm them. And I think that we haven't done enough historically ...

HELLER: Do you consider ...

ZUCKERBERG: ... and we need to step up and do more.

HELLER: Do you consider the 87 million users — do you consider them victims?

ZUCKERBERG: Senator, I think yes. I mean, they — they did not want their information to be sold to Cambridge Analytica by a developer. And — and that happened, and it happened on our watch. So even though we didn't do it, I think we have a responsibility to be able to prevent that and be able to take action sooner. And we're committing to make sure that we do that going forward.

ZUCKERBERG: Which is why the steps that I — that I announced before are so important. The two most important things that we're doing are locking down the platform to make sure that developers can't get access to that much data, so this can't happen again going forward, which I think is largely the case since 2014; and, going backwards, we need to investigate every single app that might have had access to a large amount of people's data to make sure that no one else was misusing it. If we find that they are, we're going to get into their systems, do a full audit, make sure they delete it, and we're going to tell everyone who's affected.

HELLER: Mr. Chairman, thank you.

THUNE: Thank you, Senator Heller. We'll go to Senator Peters and then into the break and then Senator Tillis coming out of the break. So Senator Peters.

PETERS: Thank you, Mr. Chairman. Mr. Zuckerberg, thank you for being here today. You know, you've talked about your very humble beginnings in starting Facebook in — in your dorm room, and I appreciated that story, but certainly Facebook has changed an awful lot over a relatively short period of time. When Facebook launched its timeline feature, consumers saw their friends' posts chronologically; that was the process.

But Facebook has since then changed to a timeline driven by some very sophisticated algorithms. And I think it has left many people, as a result of that, asking, you know, why — why am I seeing this — this feed and why am I seeing this right now. And now, in light of the Cambridge Analytica issue, Facebook users are asking, I think, some new questions right now.

Can I believe what I'm seeing and who has access to this information about me? So I think it's safe to say, very simply, that Facebook is losing the trust of an awful lot of Americans as a result of this incident. And — and I think an example of this is something that I've been hearing a lot from folks that have been coming up to me and talking about really, kind of the experience they've had, where they're having a conversation with friends.

Not on the phone, just talking. And then they see ads popping up fairly quickly on their Facebook. So I've heard constituents fear that Facebook is mining audio from their mobile devices for the purpose of ad targeting. Which I think speaks to this lack of trust that we're seeing here, but — and I understand there's some technical issues and logistical issues for that to happen.

But for the record, I think it's clear — see, I hear it all the time, including from my own staff. Yes or no, does Facebook use audio obtained from mobile devices to enrich personal information about its users?

ZUCKERBERG: No.

PETERS: The ...

ZUCKERBERG: Well, senator, let me be — let me be clear on this. So you're — you're talking about this conspiracy theory that gets passed around that we listen to what's going on, on your microphone and use that for ads.

PETERS: Right.

ZUCKERBERG: We don't do that. To be clear, we do allow people to take videos on their — on their devices and — and share those. And of course videos also have audio, so — so we do, while you're taking a video, record that and use that to make the service better by making sure that your videos have audio. But I — I mean that, I think, is pretty clear, but I just wanted to make sure I was exhaustive there.

PETERS: Well, I appreciate that. And hopefully that'll dispel a lot of what I've been hearing, so thank you for saying that. Certainly, today, in the era of big data, we are finding that data drives everything, including consumer behavior. And so consumer information is probably the most valuable information you can get in the data ecosystem.

And certainly folks, as you've mentioned in your testimony here, people like the fact that they can have targeted ads that they're going to be interested in as opposed to being bombarded by a lot of ads that they don't have any interest in; and that consumer information is important in order for you to tailor that.

But also, people are now beginning to wonder is there an expense to that when it comes to perhaps exposing them to being manipulated or through deception. You've talked about artificial intelligence, you brought that up many times during your testimony. And I know you've employed some new algorithms to target bots, bring down fake accounts, deal with terrorism, things that you've talked about in this hearing.

PETERS: But you also know that artificial intelligence is not without its risks, and that you have to be very transparent about how those algorithms are constructed. How do you see artificial intelligence, more specifically, dealing with the ecosystem by helping to get consumer insights but also keeping consumer privacy safe?

ZUCKERBERG: Senator, I think the — the core question you're asking about, A.I. transparency, is a really important one that people are just starting to very seriously study, and that's ramping up a lot. And I think this is going to be a very central question for how we think about A.I. systems over the next decade and beyond.

Right now, a lot of our A.I. systems make decisions in ways that people don't really understand.

PETERS: Right.

ZUCKERBERG: And I don't think that in 10 or 20 years, in the future that we all want to build, we want to end up with systems that people don't understand how they're making decisions.

So having — doing the research now to make sure that the — that these systems can have those principles as we're developing them, I think is certainly a — an extremely important thing.

PETERS: Well, you bring up the — the principles. Because, as you're well aware, with A.I. systems, especially in very complex environments where you have machine learning, it's sometimes very difficult to understand, as you mentioned, exactly how those decisions were arrived at. There are examples of decisions being made on a discriminatory basis, and they can compound if you're not very careful about how that occurs.

And so, is your company — you mentioned principles. Is your company developing a set of principles that are going to guide that development?

And would you provide details to us as to what those principles are and how they will help deal with this issue?

ZUCKERBERG: Yes, senator.

I can make sure that our team follows up and gets you the information on that.

And we have a whole A.I. ethics team that is working on developing basically the technology. It's not just about philosophical principles; it's also a technological foundation for making sure that this goes in the direction that we want.

PETERS: Thank you.

THUNE: Thank you, Senator Peters.

We'll recess for five, and come back in. So we'll give Mr. Zuckerberg a quick break here. Thanks.

(RECESS)

THUNE: We're back. Final stretch.

And Senator Tillis is recognized.

TILLIS: Thank you, Mr. Zuckerberg, for being here.

I think you've done a good job. I've been here for most of it — the session, except for about 20 minutes I watched on television back in my office.

I was googling earlier — actually, going on my Facebook app on my phone earlier — and I found one of your Facebook pages — yeah, one of your Facebook presences. It was the same one; on March 30th, I think, you posted a pic of a first stater. But further down, you listed out the facts since the new platform was released in 2007 — sort of a timeline. You start with 2007, then you jump to the Cambridge Analytica issue.

I actually think that we need to fully examine what Cambridge Analytica did. At the least, they broke a kind of code of conduct. If they broke any other rules or agreements with you all, I hope that they suffer the consequences.

TILLIS: But I think that timeline needs to be updated. And it really needs to go back — I've read a series of three articles that were published in the MIT Technology Review back in 2012, and it talks about how proud the Obama campaign was of exploiting data on Facebook in the 2012 campaign.

In fact, somebody asked you earlier if it made you mad about what Cambridge Analytica did, and you rightfully answered yes, but I think you should probably be equally mad when a former campaign director of the Obama campaign proudly tweeted “Facebook was surprised we were able to suck out the social graph, but they didn't stop us once they realized that was what we were doing.”

So you clearly had some people in your employ that apparently knew it — at least that's what this person said on Twitter — and thank goodness for Wayback and some of the other history-grabber machines. I'm sure we can get this tweet back and get it in the right context. I think when you do your research, it's important to get the whole view. I've worked in data-analytics practice for a good part of my career, and anybody who pretends that Cambridge Analytica was the first to exploit data clearly doesn't work, or hasn't worked, in the data-analytics field.

So when you go back and do your research on Cambridge Analytica, I would personally appreciate it if you'd start back from the first known high-profile national campaign that exploited Facebook data.

In fact, they published an app that said it would grab information about my friends, their birth dates, locations and likes. So presumably if I downloaded that app that was published by the Obama campaign — I've got 4,900 friends on my Facebook page. I delete the haters and save room for family members and true friends on my personal page, as I'm sure everybody does. And that means if I clicked yes on that app, I would have approved the access of birth dates, locations, and likes of some 4,900 people without their consent. So as you do the chronology, I think it'd be very helpful, so that we can take away the partisan rhetoric that's going on, as if this is a Republican-only issue. It's a — it's a broad-based issue that needs to be fixed. And bad actors at either end of the political spectrum need to be held accountable, and I — and I trust that you all are going to work on that.

I think the one thing that I — so for that, I just want to get to the facts. And there's no way you could answer any of those questions here, so I'm not going to burden you with that. But I think getting that chronology would be very helpful.

The one thing I would encourage people to do is go to Facebook. I'm — I'm a proud member of Facebook; just got a post from my sister on this being National Sibling Day, so I've connected with four or five of my staff while I was giving you my undivided — or family undivided attention. But go to the privacy tab. If you don't want to share something, don't share it. This is a free service. Go on there and say I don't want to allow third-party search engines to get into my Facebook page. Go on there and say only my friends can look at it. Go on there and understand what you're signing up for. It's a free app.

Now you need to do more. And I think it would be helpful. I didn't read your disclaimer page or the terms of use line by line, because it's not as though anywhere in there I could get an attorney and negotiate the terms — it was just a terms of use. I went on there, then I used the privacy settings to be as safe as I could be with a presence on Facebook.

Last thing. We talk about all these proposed pieces of legislation, good ideas, but I have one question for you: When you were developing this app in your dorm, how many people did you have in your regulatory affairs division? Exactly. So if government takes a handy — heavy-handed approach to fix this problem, then the next Facebook, the next thing that would make you wake up and worry about how you continue to be relevant as the behemoth that you are today, is probably not going to happen.

TILLIS: So we — I think that there's probably a place for some regulatory guidance here, but there's a huge place for Google, Snapchat, Twitter and all the other social-media platforms to get together and create standards. And I also believe that that person who may have looked the other way when the whole social graph was extracted for the Obama campaign — if they're still working for you, they probably shouldn't be. Or at least there should be a business code of conduct that says you don't play favorites; you're trying to create a fair place for people to share their ideas.

Thank you for being here.

THUNE: Thank you, Senator Tillis.

Senator Harris.

HARRIS: Thank you. Thank you for being here.

I've been here for — on and off for the last four hours since you've been testifying. And I have to tell you, I'm concerned about how much Facebook values trust and transparency, if we agree that a critical component of a relationship of trust and transparency is that we speak truth and we get to the truth.

During the course of this hearing, these last four hours, you have been asked several critical questions for which you don't have answers. And those questions have included whether Facebook can track users' browsing activity even after the user has logged off of Facebook; whether Facebook can track your activity across devices even when you are not logged into Facebook; who is Facebook's biggest competition; whether Facebook may store up to 96 categories of users' information; and whether you knew Kogan's terms of service, and whether you knew that Kogan could sell or transfer data.

And then another case in point, specifically as it relates to Cambridge Analytica — and a concern of mine — is that you, meaning Facebook, and I'm going to assume you personally as CEO, became aware in December 2015 that Dr. Kogan and Cambridge Analytica misappropriated data from 87 million Facebook users. That's 27 months ago that you — as Facebook, and perhaps you personally — became aware. However, a decision was made not to notify the users.

So my question is: did anyone at Facebook have a conversation, at the time that you became aware of this breach, in which the decision was made not to contact the users?

ZUCKERBERG: Senator, I don't know if there were any conversations at Facebook overall because I wasn't in a lot of them. But ...

HARRIS: On that subject.

ZUCKERBERG: Yes. I mean, I'm not sure what other people discussed. At the time — in 2015 — we heard the report that this developer, Aleksandr Kogan, had sold data to Cambridge Analytica. That's in violation of our terms.

HARRIS: Correct. And were you a part of a decision — were you part of a discussion that resulted in a decision not to inform your users?

ZUCKERBERG: I don't remember a conversation like that. But the reason why ...

HARRIS: Are you aware of anyone in the leadership at Facebook who was in a conversation where a decision was made not to inform your users? Or do you believe no such conversation ever took place?

ZUCKERBERG: I'm not sure whether there was a conversation about that. But I can tell you the thought process at the time of the company, which was that in 2015, when we heard about this, we banned the developer and we demanded that they delete all of the data and stop using it, and same with Cambridge Analytica.

(CROSSTALK)

HARRIS: And I've heard your testimony in that regard, but I'm talking about notification of the users. And this relates to the issue of transparency and the relationship of trust, informing the user about what you know in terms of how their personal information has been misused.

And I'm also concerned that when you personally became aware of this, did you or senior leadership do an inquiry to find out who at Facebook had this information, and did they not have a discussion about whether or not the users should be informed back in December 2015?

ZUCKERBERG: Senator, in retrospect, I think we clearly viewed it as a mistake that we didn't inform people and we did that based on false information that we thought that the case was closed and that the data had been deleted.

HARRIS: So there was a decision made on that basis not to inform the users. Is that correct?

ZUCKERBERG: That's my understanding. Yes.

HARRIS: Okay. And ...

ZUCKERBERG: But I — I — in retrospect I think that was a mistake and knowing what we know now, we should have handled a lot of things here differently.

HARRIS: I appreciate that point. Do you know when that decision was made not to inform the users?

ZUCKERBERG: I don't.

HARRIS: Okay. Last November the Senate Intelligence Committee held a hearing on social media influence. I was a part of that hearing. I submitted 50 written questions to Facebook and other companies and the responses that we received were unfortunately evasive and some were frankly nonresponsive. So I'm going to ask the question again here. How much revenue did Facebook earn from the user engagement that resulted from foreign propaganda?

ZUCKERBERG: Well senator, what we do know is that the IRA, the Internet Research Agency, the — the Russian firm ran about $100,000 worth of ads. I can't say that we've identified all of the foreign actors who are involved here. So, I — I — I can't say that that's all of the money but that is what we have identified.

HARRIS: Okay. My time is up. I'll submit more questions for the record. Thank you.
THUNE: Thank you, Senator Harris. Next up is Senator Kennedy.

KENNEDY: Mr. Zuckerberg, I come in peace.

(LAUGHTER)

I — I don't want to vote to have to regulate Facebook, but by God I will. That — a lot of that depends on you. I'm a little disappointed in this hearing today. I just don't feel like that we're connecting. So — so let me try to lay it out for you from my point of view.

I think you are a really smart guy. And I think you have built an extraordinary American company and you've done a lot of good. Some of the things that you've been able to do are magical.

But our — our promised digital utopia, we have discovered, has minefields. There — there's some impurities in the Facebook punch bowl. And they've got to be fixed, and I think you can fix them. Now here — here's what's going to happen. There are going to be a whole bunch of bills introduced to regulate Facebook. It's up to you whether they pass or not.

You can go back home, spend $10 million on lobbyists and fight us, or you can go back home and help us solve this problem. And there are two: one is a privacy problem; the other one is what I call a propaganda problem. Let's start with the privacy problem first. Let's start with the user agreement.

Here's what everybody's been trying to tell you today, and — and I say this gently. Your user agreement sucks.

(LAUGHTER)

You're — you — you can spot me 75 IQ points, if I can figure it out, you can figure it out. The purpose of that user agreement is to cover Facebook's rear end. It's not to inform your users about their rights.

KENNEDY: Now, you know that and I know that. I'm going to suggest to you that you go back home and rewrite it. And tell your $1,200-an-hour lawyers — no disrespect, they're good — but tell them you want it written in English, not Swahili, so the average American can understand it. That would be a start.

Are you willing — as a Facebook user, are — are you willing to give me more control over my data?

ZUCKERBERG: Senator, as someone who uses Facebook, I believe that you should have complete control over your data.

KENNEDY: Okay. Are — are you willing to go back and — and work on — on giving me a greater right to erase my data?

ZUCKERBERG: Senator, you can already delete any of the data that's there, or delete all of your data.

KENNEDY: Are — are you willing to expand that, work on expanding that?

ZUCKERBERG: Senator, I think we already do what you're referring to. But certainly, we're always working on trying to make these controls easier.

KENNEDY: Are — are you willing to expand my right to know who you're sharing my data with?

ZUCKERBERG: Senator, we already give you a list of apps that — that you're using. And you signed into those yourself, and provided affirmative consent. As I've said before ...

KENNEDY: Right. But when I use — on that — on that — on that user agreement ...

ZUCKERBERG: ... we don't share any data with ...

KENNEDY: ... are — are you willing to expand my right to prohibit you from sharing my data?

ZUCKERBERG: Senator, again, I believe that you already have that control. So, I mean, I think people have that — that full control in the system already today. If we're not communicating this clearly, then that's a big thing that we should work on. Because I think the principles that you're articulating are the ones that we believe in and try to codify in the product that we build.

KENNEDY: Are — are you willing to give me the right to take my data on Facebook and move it to another social media platform?

ZUCKERBERG: Senator, you can already do that. We have a download-your-information tool, where you can go get a file of all the content there, and then do whatever you want with it.

KENNEDY: And you're — are — then I assume you're willing to give me the right to say, “I'm going to go in your platform, and you're going to be able to tell a lot about me as a result, but I don't want you to share it with anybody”?

ZUCKERBERG: Yes, senator. And I believe you already have that ability today. People can sign on and choose to not share things, and just follow some friends or some pages and read content if that's what they want to do.

KENNEDY: Okay. Let me be sure I under — I'm about out of time. Oh, it goes fast, doesn't it? Let me ask you one final question in my 12 seconds. Could somebody call you up and say, “I want to see John Kennedy's file”?

ZUCKERBERG: Absolutely not.

KENNEDY: Could you — if — not — not — could you — not would you do it. Could you do it?

ZUCKERBERG: In — in theory.

KENNEDY: Do you have the right to put my data, a name on my data and share it with somebody?

ZUCKERBERG: I do not believe we have the right to do that.

KENNEDY: Do you have the ability?

ZUCKERBERG: Senator, the data is in the system. So ...

KENNEDY: Do you have the ability?

ZUCKERBERG: Technically, I think someone could do that. But that would be a massive breach. So we would never do that.

KENNEDY: It would be a breach?

Thank you, Mr. Chairman.

THUNE: Thank you, Senator Kennedy. Senator Baldwin's up next.

BALDWIN: Thank you, Mr. Chairman.

Thank you for being here and enduring a long day, Mr. Zuckerberg. I want to start with what I hope can be a quick round of — of questions, just so I make sure I understand your previous testimony, specifically with regard to the process by which Cambridge Analytica was able to purchase Facebook users' data. So it was an app developer, Aleksandr Kogan. He collected data via a personality quiz. Is that correct?

ZUCKERBERG: Yes.

BALDWIN: Okay.

And he thereby was able to gain access not only to the people who took the quiz but also to their networks. Is that correct, too?

ZUCKERBERG: Senator, yes. The terms of the platform at the time allowed for people to share their information and some basic information about their friends as well. And we've since changed that, as of 2014.

BALDWIN: And ...

ZUCKERBERG: Now, that's not possible.

BALDWIN: And so, in total about 87 million Facebook users. You earlier testified about the two types of ways you gain data. One is what is voluntarily shared by Facebook members and users. And the other is in order to — I think you said improve your advertising experience, whatever that exactly means — the data that Facebook collects in order to customize or focus on that.

Did — was Aleksandr Kogan able to get both of those sets of data, or just what was voluntarily entered by the user?

ZUCKERBERG: Yes, that's a good question. It was just a subset of what was entered by the person. And ...

BALDWIN: So, a subset of the 95 categories of data that you keep?

ZUCKERBERG: Yes, when you sign into the app ...

BALDWIN: Okay.

ZUCKERBERG: ... you — the app developer has to say, here are the types of data from you that I'm asking for, including public information like your name and profile, the pages you follow, other interests on your profile, that kind of content.

BALDWIN: Okay.

ZUCKERBERG: The app developer has to disclose that up front, and you agree with it.

BALDWIN: Okay. So, in answer to a couple of other senators' questions, specifically Senator Fischer's, you talked about Facebook storing this data, and I think you just talked about this data being in the system. I wonder, outside of the way in which Aleksandr Kogan was able to access this data, could Facebook be vulnerable to a data breach or hack? Why or why not?

ZUCKERBERG: Well, there are many kinds of security threats that a company like ours faces, including people trying to break in to our security systems.

BALDWIN: Okay. And if you believe that you had been hacked, do you believe you would have the duty to inform those who were impacted?

ZUCKERBERG: Yes.

BALDWIN: Okay. Do you know whether Aleksandr Kogan sold any of the data he collected with anyone other than Cambridge Analytica?

ZUCKERBERG: Senator, yes, we do. He sold it to a couple of other firms.

BALDWIN: Can you identify them?

ZUCKERBERG: Yes, there's one called Eunoia, and there may have been a couple of others as well. And I can follow up with ...

BALDWIN: Can you furnish that to me after?

ZUCKERBERG: Yes.

BALDWIN: Thank you. I appreciate that.

And then, how much do you know, or have you tried to find out, about how Cambridge Analytica used the data while they had it, before you believed they deleted it?

ZUCKERBERG: Since we just heard that they didn't delete it about a month ago, we've kicked off an internal investigation to see if they used that data in any of their ads, for example. That investigation is still underway, and we will — we can come back to the results of that once we have that.

BALDWIN: Okay. I want to switch to my home state of Wisconsin.

According to press reports, my home state of Wisconsin was a major target of Russian-bought ads on Facebook in the 2016 election. These divisive ads, touching on a number of very polarizing issues, were designed to interfere with our election. We've also learned that Russian actors using another platform, Twitter, similarly targeted Wisconsin with divisive content aimed at sowing division and dissent, including in the wake of a police-involved shooting in Milwaukee's Sherman Park neighborhood in August of 2016.

Now I find some encouragement in the steps you've outlined today to provide greater transparency regarding political ads. I do want to get further information on how you can be confident that you have excluded entities based outside of the United States.

ZUCKERBERG: We'll follow up on that.

BALDWIN: And then, I think on that topic, if you require disclosure of a political ad's sponsor, what sort of transparency will you be able to provide with regard to people who weren't the subject of that ad seeing its content?

ZUCKERBERG: Senator, you'll be able to go to any page and see all of the ads that that page has run. So if someone is running a political campaign, for example, and they're targeting one district with one ad and another district with another, historically it has been hard to track that down, but now it will be very easy.

You'll just be able to look at all of the ads that they've run, the targeting associated with each to see what they're saying to different folks, and in some cases how much they're spending on the ads, and all of the relevant information.

This is an area where I think more transparency will really help discourse overall and root out foreign interference in elections.

THUNE: Thank you, Senator Baldwin.

BALDWIN: And will you ...

THUNE: Senator Johnson.

JOHNSON: Thank you, Mr. Chairman.

Thank you, Mr. Zuckerberg, for testifying here today. Do you have any idea how many of your users actually read the terms of service, the privacy policy, the statement of rights and responsibilities? I mean, actually read it?

ZUCKERBERG: Senator, I do not.

JOHNSON: Would you imagine it's a very small percentage?

ZUCKERBERG: Senator, who reads the whole thing? I would imagine that probably most people do not read the whole thing. But everyone has the opportunity to, and consents to it.

JOHNSON: Well, I agree. But that's kind of true of every application where, you know, you want to get to it and you have to agree to it, and people just press that “agree,” the vast majority, correct?

ZUCKERBERG: Senator, it's really hard for me to make a full assessment, but ...

JOHNSON: Common sense would tell you that would be probably the case.

With all this publicity, have you documented any kind of backlash from Facebook users? I mean, has there been a dramatic falloff in the number of people who utilize Facebook because of these concerns?

ZUCKERBERG: Senator, there has not.

JOHNSON: You haven't even witnessed any?

ZUCKERBERG: Senator, there was a movement where some people were encouraging their friends to delete their account and I think that got shared a bunch.

JOHNSON: So it's kind of safe to say that Facebook users don't seem to be overly concerned about all these revelations, although obviously Congress apparently is.

ZUCKERBERG: Well, senator, I think people are concerned about it. And I think these are incredibly important issues that people want us to address. And I think people have told us that very clearly.

JOHNSON: But it seems like Facebook users still want to use the platform because they enjoy sharing photos and they share the connectivity with family members, that type of thing. And that overrides their concerns about privacy.

You talk about the user owns the data, you know, there are a number — have been a number of proposals of having that data stay with the user and allow the user to monetize it themselves. Your COO, Ms. Sandberg, mentioned possibly, if you can't utilize that data to sell advertising, perhaps we would charge people to go onto Facebook.

JOHNSON: Have you thought about that model, where the user data is actually monetized by the actual user?

ZUCKERBERG: Senator, I'm not sure exactly how — how it would work for it to be monetized by the person directly. In general, where — we believe that the ads model is the right one for us because it aligns with our social mission of trying to connect everyone and bring the world closer together.

JOHNSON: But — but you're aware of people making that kind of proposal, correct?

ZUCKERBERG: Yes. I — Senator, a number of people suggest that — that we should offer a version where people cannot have ads if they pay a monthly subscription, and certainly we consider ideas like that. I think that they're reasonable ideas to — to think through. But overall, the — I think that the ads experience is going to be the best one. I think in general, people like not having to pay for a service. A lot of people can't afford to pay for a service around the world, and this aligns with our mission the best.

JOHNSON: You answered Senator Graham when he asked you if you thought you were a monopoly, that you didn't think so. You're obviously a big player in the space. That might be an area for competition, correct, if somebody else wants to create a social platform that allows a user to monetize their own data?

ZUCKERBERG: Senator, yes. There are lots of new social apps all the time. And as I said before, the average American I think uses eight different communication and social apps. So there's a lot of different choice and a lot of innovation and activity going on in this space.

JOHNSON: I want — in a very short period of time. You talked about the difference between advertisers and application developers. Because those — again, you — you said in earlier testimony that advertisers have no access to data whatsoever. But application developers do? Now, is that only through their own service agreement with their customers, or do they actually access data as they're developing applications?

ZUCKERBERG: Senator, this is an important distinction, so thanks for giving me the opportunity to clarify this. People — we give people the ability to take their data to another app if they want. And this is a question that Senator Kennedy asked me just a few minutes ago.

The reason why we designed the platform that way is because we — we thought it would be very useful to make it so that people could easily bring their data to other — to other services. Some people inside the company argued against that at the time because they were worried that — they said hey, we should just make it so that we can be the only ones who develop this stuff, but ...

JOHNSON: But again, that's — that's the ...

ZUCKERBERG: ... we thought that that was a — a useful thing for people to ...

JOHNSON: ... that's the user agreeing to allow you to share — when they're using that app, to allow Facebook to share that data. Does the developer ever have access to that prior to users using it? Meaning developing the application. Because you used the term “scraped” data. What does that mean? Who scraped the data?

ZUCKERBERG: Yes, senator. This is a good question. So there's the developer platform, which is the sanctioned way that an app developer can ask a person to access information. We also have certain features and certain things that are public, right? A lot of the information that people choose to put on Facebook, they're sharing with everyone in the world. Not privately, but, you know, you put your name, you put your profile picture, that's public information that people put out there. And sometimes people who aren't registered developers at Facebook try to load a lot of pages in order to get access to a bunch of people's public information and aggregate it.

We fight back hard against that, because we don't want anyone to aggregate information, even if people made it public and chose to share it with everyone.

JOHNSON: Okay. Thank you, Mr. Chairman.

THUNE: Thank you, Senator Johnson.

Senator Hassan?

HASSAN: Thank you, Mr. Chair.

Thank you, Mr. Zuckerberg, for being here today. I want to talk to a couple of broader issues. I'm concerned that Facebook's profitability rests on two potentially problematic foundations. And we've heard other senators talk about this a little today. The foundations are maximizing the amount of time people spend on your products and collecting people's data.

HASSAN: I've looked at Facebook's 2017 corporate financial statement, where you lay out some of the major risks to your business. One risk is a decrease in, and I quote, “user engagement, including time spent on our products.” That concerns me because of the research we've seen suggesting that too much time spent on social media can hurt people's mental health, especially young people.

Another major risk to your business is the potential decline in — and here's another quote — “the effectiveness of our ad targeting or the degree to which users opt out of certain types of ad targeting, including as a result of changes that enhance the user's privacy.”

There's clearly tension, as other senators have pointed out, between your bottom line and what's best for your users. You've said in your testimony that Facebook's mission is to bring the world closer together, and you've said that you will never prioritize advertisers over that mission. And I believe that you believe that.

But at the end of the day, your business model does prioritize advertisers over the mission. Facebook is a for-profit company, and as the CEO you have a legal duty to do what's best for your shareholders. So given all of that, why should we think that Facebook, on its own, will ever truly be able to make the changes that we need it to make to protect American's well-being and privacy?

ZUCKERBERG: Well, senator, you've raised a number of important points in there, so just let me respond in ...

HASSAN: Sure.

ZUCKERBERG: ... in a couple of different ways. The first is that I think it's really important to think about what we're doing, is building this community over the long term. Any business has the opportunity to do things that might increase revenue in the short term, but at the expense of trust or building engagement over time. What we actually find is not necessarily that increasing time spent, especially not just in the short term, is going to be best for our business.

It actually — it aligns very closely with — with the well-being research that we've done. That when people are interacting with other people, and posting and basically building relationships, that is both correlated with higher measures of well-being, health, happiness, not feeling lonely, and that ends up being better for the business than when they're doing lower value things like just passively consuming content.

So I think that that's — that's an important point to — to ...

HASSAN: Okay, but — and I understand the point that you're trying to make here, but here's what I'm concerned about. We have heard this point from you over the last decade-plus. Since you've founded Facebook — and I understand it — you've — you founded it pretty much as a solo entrepreneur with your roommate.

But now, you know, you're sitting here at the head of a bazillion dollar company, and we've heard you apologize numerous times and promise to change, but here we are again, right? So I really firmly believe in free enterprise, but when private companies are unwilling or unable to do what's necessary, public officials have, historically, in every industry, stepped up to protect our constituents and consumers.

You've supported targeted regulations, such as the Honest Ads Act, and that's an important step for election integrity, I'm proud to be a co-sponsor of that bill. But we need to address other, broader issues as well. And today you've said you'd be open to some regulation, but this has been a pretty general conversation. So will you commit to working with Congress to develop ways of protecting constituent privacy and well-being, even if it means that that results in some laws that will require you to adjust your business model?

ZUCKERBERG: Senator, yes. We will commit to that. I think that that's an important conversation to have. Our position is not that regulation is bad. I think the Internet is so important in people's lives, and it's getting more important.

HASSAN: Yes.

ZUCKERBERG: The expectations on Internet companies and technology companies overall are growing, and I think the real question is, “what is the right framework for this?” not “should there be one?”

HASSAN: That is very helpful, and I think the other question — and it doesn't just go to Facebook — is whether the framework should include financial penalties when large providers, like Facebook, are breached and privacy is compromised as a result. Because right now, there is very little incentive for whether it's Facebook or Equifax to actually be aggressive in protecting customer privacy and looking for potential breaches or vulnerabilities in their systems.

So what we hear, after the fact, after people's privacy has been breached, after they've taken the harm that comes with that and considerable inconvenience in addition to the harm, is apologies. But there is no financial incentive right now, it seems to me, for these companies to aggressively stand in their consumers' stead and protect their privacy. And I would really look forward to working with you on that, and getting your considered opinion about it.

ZUCKERBERG: Well senator, we — we look forward to — to discussing that with you. I would disagree, however, that we have no financial incentive or incentive overall to do this. This episode has clearly hurt us, and has clearly made it harder for us to achieve the social mission that we care about. And we now have to do a lot of work around building trust back, which — which is — is a really important part of this.

HASSAN: Well, I thank you. My time is up and — and I'll follow up with you on that.

GRASSLEY: Senator Capito.

CAPITO: Thank you, Chairman Grassley. Thank you, Mr. Zuckerberg, for being here today.

I — I want to ask just kind of a process question. You've said more than a few times that Facebook users can delete information from their own account at any time. Well, we know — and of course I do; I've got grandchildren now with children — you tell your children, once you make that mark in — in the Internet system, it never really goes away.

So my question to you is, if once — and I think you answered that — that once an individual deletes the information from their page it's gone forever from Facebook's archives. Is that correct?

ZUCKERBERG: Yes. And I think you raise a good point though, which is that it is — we will delete it from our systems but if you shared something to someone else then we can't guarantee that they don't have it somewhere else.

CAPITO: Okay. So if somebody leaves Facebook and then rejoins and asks Facebook, can you recreate my past, your answer would be?

ZUCKERBERG: If they delete their account, the answer is no. That's why we actually offer two options. We offer deactivation, which allows you to shut down or suspend your account, but not delete the information. Because actually a lot of people want to — at least for some period of time. I mean we hear students with exams coming up want to not be on Facebook because they want to make sure they can focus on the exam. So they deactivate their account temporarily, but then want the ability to turn it back on when they're ready. You can also delete your account, which is wiping everything. If you do that, then you can't get it back.
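The two options described — deactivation, which suspends the account but keeps the data, versus deletion, which wipes it irreversibly — amount to a small state machine. A minimal sketch, with hypothetical names:

```python
class Account:
    """Hypothetical account lifecycle: deactivation suspends but retains
    the data; deletion wipes it, so rejoining cannot recreate the past."""

    def __init__(self, data):
        self.data = data
        self.state = "active"

    def deactivate(self):
        # Temporary: e.g. going quiet during exams, then turning it back on.
        if self.state == "active":
            self.state = "deactivated"

    def reactivate(self):
        if self.state == "deactivated":
            self.state = "active"

    def delete(self):
        # Irreversible: the data is gone; there is no path back to "active".
        self.data = None
        self.state = "deleted"

acct = Account({"photos": ["graduation.jpg"]})
acct.deactivate()
acct.reactivate()
assert acct.data is not None      # survives a deactivate/reactivate cycle
acct.delete()
assert acct.data is None          # nothing left to recreate on rejoin
```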

CAPITO: You can't get it back. It's gone from your archives?

ZUCKERBERG: Yes.

CAPITO: But is it ever really gone?

ZUCKERBERG: From our systems?

CAPITO: From — from the cloud or wherever it — wherever it is. I mean, it always seems to be able to reappear in investigations and other things. Not necessarily Facebook, but some other emails and — and other things of that nature.

What about the information going from the past? The information that's already been in the Cambridge Analytica case? You can't really go back and redo that. So I'm going to assume that what we've been talking with and with the improvements that you're making now at Facebook are from this point forward. Is that a correct assumption?

ZUCKERBERG: Senator, I actually do think we can go back in some cases. And that's why one of the things that I announced is that we're going to be investigating every single app that had access to a large amount of information before we locked down the platform in 2014. And if we find any pattern of suspicious activity, then we're going to go do a full audit of their systems. And if we find that anyone's improperly using data, then we'll take action to make sure that they delete the data, and we'll inform everyone who — who may have had their data misused.

CAPITO: Okay, other — other suggestion I would make, because we're kind of running out of time here, is you've heard more than a few complaints, and I join the chorus, of the — the lapse in the time of when you discovered and when you became transparent.

And I understand you sent out two messages just today to — to users. So I would say — you say you regret that decision, that you wish you'd been more transparent at the time, so I would imagine if in the course of your investigation, you find more breaches so to speak, that you will be reinforming your Facebook customers.

ZUCKERBERG: Yes, that is correct. We have already committed that if we find any improper use, we will inform everyone affected.

CAPITO: Okay, thank you. You've said also that you want to have an active view on controlling your ecosystem. Last week the FDA Commissioner Scott Gottlieb, addressed the Drug Summit in Atlanta and spoke on the national opioid epidemic.

My state, I'm from West Virginia, and thank you for visiting and next time you visit, if you would please bring some fiber because we don't have connectivity in — in our rural areas like we really need, and Facebook could really help us with that.

So — so Commissioner Gottlieb called up — called upon social media and Internet service providers, and he mentioned Facebook when he talked about it, to try to disrupt the sale — the sale of illegal drugs and particularly the powerful opioid, Fentanyl, which has been advertised and sold online.

I know you have policies against this, the commissioner is announcing his intention to convene a meeting of chief executives and senior leaders and I want to know — can I get a commitment from you today that Facebook will commit to having a representative with Commissioner Gottlieb to finalize with this meeting?

ZUCKERBERG: Senator, that sounds like an important initiative, and we will send someone. And let me also say that on your point about connectivity, we do have a — a group at Facebook that is working on trying to spread Internet connectivity in rural areas, and we would be happy to follow up with you on that as well.

That's something that I'm very passionate about.

CAPITO: That's good. That's good news. Last question I have, just on the advertising, if somebody advertises on Facebook and somebody purchases something, does Facebook get a percentage or any kind of a fee associated with a successful purchase from an advertiser?

ZUCKERBERG: Senator, no. The way that the system works is people — advertisers bid how much it's worth it to them to show an ad or when an action happens. So it's not that we would get a percent of the sale, but let's — let's just use an example.

So let's say you have — you're an app developer, and you — your goal is you want to get more people to install your app. You could bid in the ad system and say I will pay $3 anytime someone installs this app.

And then we basically calculate on — on our side which ads are going to be relevant for people, and we have an incentive to show people ads that are going to be relevant because we only get paid when it delivers a business result, and — and that's how the system works.
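The pricing model described — advertisers bid per action, and the platform is paid only when the action happens, so it prefers relevant ads — can be illustrated as an expected-value ranking. This is a simplified sketch of that incentive, not Facebook's actual auction; names and numbers are invented.

```python
def rank_ads(candidates):
    """Rank candidate ads by expected payment per impression.

    candidates: list of (advertiser, bid_per_action, predicted_action_rate).
    The platform collects bid_per_action only when the action (e.g. an app
    install at $3) actually happens, so expected revenue per impression is
    bid * predicted rate -- relevance matters as much as the bid.
    """
    return sorted(candidates, key=lambda c: c[1] * c[2], reverse=True)

candidates = [
    ("app_a", 3.00, 0.020),   # expected 0.060 per impression
    ("app_b", 1.00, 0.100),   # expected 0.100 -- wins despite the lowest bid
    ("app_c", 5.00, 0.005),   # expected 0.025
]
print(rank_ads(candidates)[0][0])  # -> app_b
```

Note how the lowest bidder wins because its predicted action rate is highest — which is the "incentive to show people ads that are going to be relevant" in the answer above.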

CAPITO: So it — it could be one — you could be paid for the advertisement. I mean for the sale.

ZUCKERBERG: We — we get paid when the action the advertiser wants to happen, happens.

CAPITO: All right, thank you.

THUNE: Senator — Senator Cortez Masto?

CORTEZ MASTO: Thank you.

Mr. Zuckerberg, thank you. It's been a long afternoon and I — I appreciate you being here and — and taking the time with every single one of us. I'm going to echo a lot of what I've heard my colleagues say today as well.

I appreciate you being here, appreciate the apology, but stop apologizing and let's make the change. I — I think it's time to really change the conduct. I appreciate the fact that you talked about your principles for Facebook: (inaudible) users on the use of the data, and that users have complete control of their data.

CORTEZ MASTO: But the skepticism that I have — and I'm hoping you can help me with this — is that over the last what, seven, 14 years, we haven't seen really much change in ensuring that the privacy is there and that individual users have control over their data.

So — so let me — let me ask you this. Back in 2009, you made two changes to your privacy policy. And, in fact, prior to that, most users could either identify only friends, or friends of friends as part of their — their privacy, correct? If they wanted to protect their data. They could identify only friends or friends of friends who could see their data. Isn't that correct?

ZUCKERBERG: Senator, I believe that we've had the option for people to share with friends, friends of friends, a custom audience or publicly for a long time. I — I don't remember ...

CORTEZ MASTO: Okay.

ZUCKERBERG: ... exactly when we put that in place, but I believe it was before 2009.

CORTEZ MASTO: So either you can choose only friends or friends of friends to decide how you're going to share that — protect that data, correct?

ZUCKERBERG: Those are two of the options, yes.

CORTEZ MASTO: Okay. And in 2011 when the FTC started taking a look at this, they were concerned that if somebody chose only friends, that the individual user was under the impression they could continue to restrict sharing of data to a limited audience, but that wasn't the case.

And, in fact, selecting friends only did not prevent users' information from being shared with third — third-party applications their friend used. Isn't that the case, and that's why the FTC was looking at — at you and making that change? Because there was concern that if you had friends on your page, a third party could access that information. Isn't that correct?

ZUCKERBERG: Senator, I don't remember the exact context that the ...

CORTEZ MASTO:  So let me — let me help you here. Because David Vladeck who was — spent nearly four years as director of the Federal Trade Commission's Bureau of Consumer Protection, where he worked, including on the FTC's enforcement case against Facebook, basically identifies in this article that was the case.

That not only did Facebook misrepresent — and that's why there were eight counts of deceptive acts and practices — the actual FTC, in its November 2011 decree, required Facebook to do three things. The decree barred Facebook from making any further deceptive privacy claims, and it required Facebook to get consumers' approval before changing the way it shares their data. And most importantly, the third thing, it required Facebook to give users clear and conspicuous notice and to obtain affirmative express consent before sharing their data with third parties. That was part of the FTC consent decree, correct?

ZUCKERBERG: Senator, that sounds right to me.

CORTEZ MASTO: Okay. So at that time, you're on notice that there were concerns about the sharing of data and information — users' data including those friends — with third parties, correct?

ZUCKERBERG: Senator, my understanding ...

CORTEZ MASTO: Well, let me ask you this. Let me do it this way. In response to the FTC consent to make those changes, did you make those changes and what did you do to ensure individuals' user data was protected and they had notice of that information and that potentially third parties would be accessing that and they had to give express consent? What did you specifically do in response to that?

ZUCKERBERG: Senator, a number of things. One of the most important parts of the FTC consent decree that we signed was establishing a robust privacy program at the company, headed by our chief privacy officer, Erin Egan. We're now ...

CORTEZ MASTO: Can you give me specifics? And I know — and — and I've heard this over and over again. I'm running out of time. But here's the concern that I have. It can't be a privacy policy because that's what the consent said it couldn't be.

It had to be something very specific, something very simple, like you've heard from my colleagues. And that did not occur. Had that occurred, we wouldn't be here today talking about Cambridge Analytica.

CORTEZ MASTO: Isn't that really true? Had you addressed those issues then, had you done an audit, had you looked at not only the third-party applications, but audited their associated data storage as well, you would have known that this type of data information was being shared.

And that's our concern and that's what I'm saying now, time just to make a change. It's time to really address the privacy issue. It's time to really come and lead the country on this issue and how we can protect individual user's data and information. I know my time is running out, but I appreciate you being here and I'm just hoping that you're committed to working with us in the future in addressing these concerns.

THUNE: Thank you, Senator Cortez Masto.

Senator Gardner?

GARDNER: Thank you, Mr. Chairman.

And thank you, Mr. Zuckerberg, for your patience and testimony today. The end is near, I think, one, two, three or four people. So that's good news, to get out of this hearing.

A couple questions for you, to clarify one of the comments made about deleting accounts from Facebook. In the user agreement it says that when you delete I.P. content, it is deleted in a manner similar to emptying the recycle bin on a computer. However, you understand that removed content may persist in backup copies for a reasonable period of time. How long is that?

ZUCKERBERG: Senator, I don't know, sitting here, what our current systems are on that. But the intent is to get all the content out of the system as quickly as possible.

GARDNER: And does that mean your user data as well? It talks about I.P. content, is that the same thing as your user data; it can sit in backup copies?

ZUCKERBERG: Senator, I think that that is probably right. I — I don't — I'm not sitting here today having full knowledge of — of our current state of the systems around wiping all of the data out of backups. So I can follow up with you on that afterwards, but what I can tell you ...

GARDNER: But all backups get wiped?

ZUCKERBERG: That is certainly the way it's — it — it's supposed to work.

GARDNER: Has there ever been a failure of that?

ZUCKERBERG: Senator, I — I don't know. But this is — if we tell people that we're going to delete their data then we need to do that.

GARDNER: And you do, do that?

ZUCKERBERG: (OFF-MIKE)

GARDNER: Thank you.
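The deletion behavior the user agreement describes — content leaves the primary store immediately but persists in backups for a "reasonable period" — can be sketched as a two-stage store. The retention window and names below are hypothetical; the transcript itself leaves the actual period unspecified.

```python
RETENTION_DAYS = 30   # hypothetical stand-in for the "reasonable period"

class ContentStore:
    """Sketch of delete-with-backups: deletion empties the 'recycle bin'
    at once, but a backup record lingers until the retention window ends."""

    def __init__(self):
        self.primary = {}            # content_id -> value
        self.backup_deleted_on = {}  # content_id -> day the user deleted it

    def put(self, cid, value):
        self.primary[cid] = value

    def delete(self, cid, today):
        # Gone from the primary store immediately.
        if self.primary.pop(cid, None) is not None:
            self.backup_deleted_on[cid] = today

    def purge_backups(self, today):
        # Backups older than the retention window are wiped too.
        expired = [cid for cid, day in self.backup_deleted_on.items()
                   if today - day >= RETENTION_DAYS]
        for cid in expired:
            del self.backup_deleted_on[cid]

    def exists_anywhere(self, cid):
        return cid in self.primary or cid in self.backup_deleted_on

store = ContentStore()
store.put("post1", "hello")
store.delete("post1", today=0)
store.purge_backups(today=10)
assert store.exists_anywhere("post1")      # still in backups on day 10
store.purge_backups(today=30)
assert not store.exists_anywhere("post1")  # wiped once the window passes
```

Senator Gardner's question about whether backups ever fail to get wiped is, in these terms, whether `purge_backups` is guaranteed to run for every replica — which the sketch cannot answer and the witness did not either.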

Mr. Zuckerberg, a couple of other questions I think that get to the heart of this expectation gap, as I call it, with — with the users. Facebook, as I understand it — if you're logged in to Facebook in a browser and you open a new tab in that browser while you have the Facebook tab open, and the page in that new tab has a Facebook button on it, you track the article that you're reading. Is that correct?

ZUCKERBERG: Senator, I ...

GARDNER: In the new tab.

ZUCKERBERG: ... I think that there — there is functionality like that, yes.

GARDNER: Do you think users understand that?

ZUCKERBERG: Senator, I think that they — that there is a reasonable — the — I think the answer's probably yes for the following reason, because when we show a “Like” button on a website, we show social context there. So, it says here are your friends who liked that. So in order to do that, we would have to ...

GARDNER: But if — but if you've got your Facebook browser open and you open up the article in the Denver Post, and it has a Facebook button on it, you think they know — consumers, users know, that Facebook now knows what article you're reading in the Denver Post?

ZUCKERBERG: Well, we would need to have that in order to serve up that — the — the like button and show you who your friends were who had also liked that.

GARDNER: So, I — I — I — and I think that goes to the heart of this expectation gap because I don't think consumers, users necessarily understand that. I mean, in going through this user agreement, as others have, you do need a lawyer to understand it. And I hope that you can close that expectation gap by simplifying the user agreement, making sure that people understand their privacy.
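The mechanism behind this exchange — why embedding a Like button lets the platform see which article a logged-in user is reading — comes down to what the browser sends when it fetches the widget. A minimal sketch, with a hypothetical endpoint and cookie name:

```python
def widget_request(article_url, browser_cookies):
    """Sketch of what a browser sends when a third-party page (say, a
    newspaper article) embeds a social platform's Like button. The request
    goes to the platform's own domain and carries the embedding page's URL
    (needed to look up like counts and social context) plus any platform
    cookie -- so a logged-in user's identity can be joined to the article
    they are reading. Endpoint and cookie names are invented.
    """
    return {
        "endpoint": "https://social.example/plugins/like",
        "referer": article_url,                       # reveals the article
        "session": browser_cookies.get("session_id"), # reveals the user
    }

req = widget_request("https://newspaper.example/story-123",
                     {"session_id": "user-42"})
print(req["referer"], req["session"])
```

This is also why Mr. Zuckerberg says the page URL is needed "in order to serve up the Like button": the social context (which friends liked this article) cannot be rendered without telling the platform which article it is.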

Has there ever been a violation outside of the — the — the talk about Cambridge Analytica about the privacy settings? Has a privacy setting violation ever occurred outside of Cambridge Analytica?

ZUCKERBERG: I'm not aware that we have had systems that have ...

GARDNER: So the privacy setting a — a — a consumer, a user uses, have always been respected? There's never been an instance where those privacy settings have been violated?

ZUCKERBERG: That's my understanding. I mean, this is the core thing that our company does is — you come to Facebook, you say, hey, I want to share this photo or I want to send this message to these people. And then ...

(CROSSTALK)

GARDNER: Has there ever been a breach of Facebook data or a hack?

ZUCKERBERG: There have been — I don't believe there has been a breach of data that we are aware of.

GARDNER: Has there been a hack?

ZUCKERBERG: Yes.

GARDNER: Have those hacks accessed user data?

ZUCKERBERG: I don't believe so. I think we had an instance in 2013 where someone was able to install some malware on a few employees' computers and had access to some content on their computers, but I don't believe ...

GARDNER: Never to affect the user of the page? Never affected the user page?

ZUCKERBERG: I do not believe so.

GARDNER: Okay. Has the government ever asked to remove a page, have a page removed?

ZUCKERBERG: Senator, I believe so.

GARDNER: Okay, and has the government ever — can you get a warrant to join a page to get to be on a page — pretending you're a separate user, to be liked by that, to track what that person's doing. Do you need a warrant for that or can the government just do that? The FBI? Anybody?

ZUCKERBERG: Senator, I'm not sure I fully understand. You're saying ...

GARDNER: We can follow up on that, because I do have one final question I want to ask you.

A couple days ago, I think Facebook talked about how it would label traditional advocacy as political ads. For instance, if the Sierra Club was to run a climate change ad, that would be labeled political — a political ad. If the Chamber of Commerce wanted to place an ad saying climate change regulations would have an impact, and to talk about that through an ad, that would be labeled as political, which is different than current standards of what is political and what is issue advocacy.

Is it your intent to label things political that would be in contradiction to federal law?

ZUCKERBERG: Senator, the intent of what we're trying to get at is the foreign election interference we've seen has taken more of the form of issue ads than direct political electioneering advertising. So because of that, we think it's important to extend the verification and transparency to issue ads in order to block the kind of interference that the Russians attempted to do, and I think will likely continue to attempt to do. That's why I think those measures are important to do.

GARDNER: Thank you.

THUNE: Thank you, Senator Gardner. Senator Tester?

TESTER: Thank you, Mr. Chairman. I want to thank you for being here today, Mark. I appreciate you coming in. I hope this isn't the last time we see you in front of committee. I know this is — we're approaching five hours, so it's been a little tenuous. Some mental gymnastics for all of us, and I just want to thank you for being here.

Facebook is an American company, and with that, I believe you've got a responsibility to protect American liberties central to our privacy. Facebook allowed a foreign company to steal private information. They allowed a foreign company to steal private information from tens of millions of Americans, largely without any knowledge of their own.

Who and how we choose to share opinions with is a question of personal freedom. Who we share our likes and dislikes with is a question of personal freedom. This is a troubling episode that completely shatters that liberty, so that you understand the magnitude of this. Montanans are deeply concerned — they are deeply concerned with this breach of privacy and trust.

TESTER: So you've been at this nearly five hours today. So besides taking reactive steps — and I want you to be as concise as you possibly can — what are you doing to make sure what Cambridge Analytica did, never happens again?

ZUCKERBERG: Thank you, senator.

There are three important steps that we're taking here. For Cambridge Analytica, first of all, we need to finish resolving this by doing a full audit of their systems to make sure that they delete all the data that they have and so we can fully understand what happened. There are two sets of steps that we're taking to make sure that this doesn't happen again.

The most important is restricting the amount of information that developers will have access to going forward. The good news here is that back in 2014, we actually had already made a large change to restrict access on the platform that would have prevented this issue with Cambridge Analytica from happening again today. Clearly we did not do that soon enough.

If we'd done it a couple of years earlier, then we probably wouldn't be sitting here today. But this isn't a change that we had to take now in 2018, it's largely a change that we made back in 2014.

TESTER: Okay.

ZUCKERBERG: There were other parts of the platform that we also similarly can lock down now to make sure that other issues that might have been exploited, in the future won't be able to. And we've taken a number of those steps and I've outlined those in — in my written statement as well.

TESTER: I appreciate that. And you feel confident that the actions you've taken thus far — whether it was ones back in 2014 or the one that you just talked about, about locking the other parts — will adequately protect the folks that use Facebook?

ZUCKERBERG: Senator, I believe so ...

TESTER: Okay.

ZUCKERBERG: ... although security is never a solved problem.

TESTER: That's all I need. You talked about a full audit of the — of Cambridge Analytica systems. Can you do a full audit if that information's stored somewhere — some other country?

ZUCKERBERG: Senator, if — right now, we're waiting on the audit because the U.K. government is doing a government investigation of them.

TESTER: Okay, but ...

ZUCKERBERG: And I do believe that the government will have the ability to get into the systems even if we can't ...

TESTER: That's if the information is stored in the U.K. But what if the information is stored in some other country? Can — is — is an audit even possible?

ZUCKERBERG: Well, senator, we believe a bunch of the information that we — that we will be able to audit. I think you raise an important question and if we have issues, then we — if we are not able to do an audit to our satisfaction, we are going to take legal action to enable us to do that. And if — and also, I know that the U.K. and U.S. governments are also involved in working on this as well.

TESTER: Yes, I don't — I don't really — I'm telling you, I — I have faith in the U.S. government. I really actually have faith in the U.K. too. I — there have been claims that this information is being stored in Russia. I don't care, it could be stored anywhere in the world. I don't know how you get access to that information. I'm not as smart as you are about tech information.

And so the question really becomes — and I got to move on — but the question is I don't see how you can perform a full audit if they've got stuff stored somewhere else that we can't get access to. That's all. Maybe you have other ideas on how to do that.

ZUCKERBERG: Well, I think we'll know once we get in there whether we feel like we can fully investigate everything.

TESTER: Just real quickly. Senator Schatz asked a question earlier about — about data and who owns the data. I want to dig into it a little bit more. You said — and I think multiple times during this hearing — that I own the data on Facebook if it's my data.

ZUCKERBERG: Yes.

TESTER: And — and I'm going to tell you that I think that that sounds really good to me. But in practice — let's think about this for a second. You're making about 40 billion bucks a year on the data. I'm not making any money on it. It feels like you own the data. And in fact, I would say that the — the data that was — that was breached through Cambridge Analytica, which impacted — and correct me if these numbers are wrong — some 80 million Americans.

TESTER: My guess is that few, if any, knew that that information was being breached. If I own that data, I know it's being breached. So could — could you give me some sort of idea on how you can really honestly say it's my data when, quite frankly, they may have goods on me. I don't — I don't want them to have any information on me.

ZUCKERBERG: Senator, when I say ...

TESTER: Because if I own it, I can stop it.

ZUCKERBERG: Yes. So, senator, when I say it's your data, what we mean is that you have control over how it's used on Facebook. You clearly need to give Facebook a license to use it within our system.

TESTER: Yes.

ZUCKERBERG: Or else — or else the service doesn't work.

TESTER: Yes, I know, and this license has been brought up many times today, and I'm going to be quiet in just one second, Mr. Chairman. But the fact is, the license is very thick, maybe intentionally, so people get tired of reading it and don't want to.

Look, Mark, I appreciate you being here. I look forward to having another hearing. Thank you.

THUNE: Senator Young.

YOUNG: Mr. Zuckerberg, thanks so much for being here and enduring the many questions today. I think it's important that you're here, because your social media platform happens to be the ubiquitous social media platform, and there's not a senator that you heard from today who isn't on Facebook, who doesn't communicate with constituents through Facebook. In a sense, we have to be on it, and so I think it's especially important that you're here, not just for Facebook, but really for our country and beyond.

The threshold question that — that continues to emerge here today is what are the reasonable expectations of privacy that users ought to have? And I'll tell you my neighbors are unsatisfied by an answer to that question that involves, you know, “take a look at the user agreement.” And I — I think there's been a fair amount of discussion here about whether or not people actually read that user agreement. I would encourage you to, you know, survey that, get all the information you can with respect to that, and make sure that — make sure that user agreement is easy to understand and streamlined and so forth.

Mr. Zuckerberg, earlier in today's hearing you drew a distinction that I thought was interesting. It caught my attention. It was a distinction between the consumer expectation of privacy depending upon whether they were on an ISP or “the pipes of the Internet,” as you characterized it, or on an Edge platform, like Facebook.

I find this distinction somewhat unsatisfying, because most folks who use the Internet just think of it as one place, if you will. They think of it as “the Internet,” as opposed to various places requiring different degrees of privacy.

Could you — could you speak to this issue and indicate whether you'd support a comprehensive privacy policy that applies in the same manner to all entities across the entire net — Internet ecosystem?

ZUCKERBERG: Senator, sure. I think that people's expectations of how they use these different systems are different. Some things — some apps are very lightweight, and you can fully encrypt the data going across them in a way that the app developer — or the — the pipes, in the ISP case —

You probably shouldn't be able to see any of the content, and I — I think you probably should have a full expectation that no one is going to be introspecting or looking at that content.

(CROSSTALK)

YOUNG: Give me some quick examples, if you would kindly, sir.

ZUCKERBERG: Sure. Well, when data is going over the Verizon network, I think it would be good for that to be as encrypted as possible, and such that Verizon wouldn't look at it, right? I think that's what people expect, and I don't know that being able to look at the data is required to — to deliver their service.

That's how WhatsApp works too, so that's an app. It's a very lightweight app. It doesn't require us to know a lot of information about you, so we can offer that with full encryption, and therefore, we're not looking — we don't see the content.

For a service like Facebook or Instagram, where you're sharing photos and then they — people want to access them from lots of different places. People kind of want to store that in a central place, so that way they can go access it from — from a lot of different devices.

In order to do that, we need to have an understanding of what that content is. So I think the — the expectations of — of what Facebook will have knowledge of versus what an ISP will have knowledge of are just different.

YOUNG: I think that needs to be clearly communicated to your users and — and we'll leave it at that. That — those — those different levels of privacy that the user can expect to enjoy when they're on your platform. I'd like to sort of take a different tack on Internet privacy policy with you, sir.

Might we create stronger privacy rights for consumers, either through a stronger general property-right regime online — say, a new law that states unequivocally something that you said before, that users own their online data — or through stronger affirmative opt-in requirements on platforms like yours? Now, if we were to do that — if we were to adopt one of those two approaches — would you need to retool your model?

ZUCKERBERG: Senator, could you repeat what the approaches are again?

YOUNG: Yes. So one is to create a stronger property right for the individual online through a law that states unequivocally that users own their data. The other is a stronger affirmative opt-in requirement to be a user on Facebook. Would you have to fundamentally change the Facebook architecture to accommodate those policies?

ZUCKERBERG: Senator, those policies and the principles that you articulated are generally how we view our service already. So depending on the details of what — what your — the proposal actually ends up being — and the details do just matter a huge amount here — it's not clear that it would be a fundamental shift.

But the details really matter and if this is something you're considering or working on, we would love to follow up with you on this because this is very important to get right.

YOUNG: I'd love to work with you. I'm out of time. Thank you.

GRASSLEY: Senator Thune has a closing comment.

THUNE: Just a ...

GRASSLEY: ... and I have a process statement for everybody to listen to.

THUNE: Mr. Chairman, thank you — and thanks to all of our members for their patience. It's been a long hearing, a particularly long hearing for you, Mr. Zuckerberg. Thank you for — for sitting through this. But I think this is important. I do have a letter here from the Motion Picture Association of America that I want to get into the record without objection.

GRASSLEY: Without objection, so ordered.

THUNE: And then — and just a quick — quick sort of wrap-up question, if you will, and maybe one quick comment. You've answered several questions today about efforts to keep bad actors, whether that's a terrorist group or a malicious foreign agent, off of your platform.

You've also heard concerns about bias at Facebook, particularly bias against conservatives. And — and just as a final question, can you assure us that when you are improving tools to stop bad actors, you will err on the side of protecting speech, especially political speech, from all different corners?

ZUCKERBERG: Senator, yes. That's our — that's our approach. If there is an imminent threat of harm, we're going to take a conservative position on that and make sure that we flag that and understand that more broadly. But overall, I want to make sure that we provide people with the most voice possible. I want the widest possible expression, and I don't want anyone at our company to make any decisions based on the — the political ideology of the content.

THUNE: Okay. And just one final observation, Chairman Grassley. Mr. Zuckerberg's answered a lot of questions today, but there are also a lot of promises to follow up with some of our members, sometimes on questions about Facebook practices that seem fairly straightforward. I think it's going to be hard for us to fashion solutions to — to solve some of this stuff until we have some of those answers.

And you had indicated earlier that you're continuing to try to find out who among these other analytics companies may have had access to user data that — that they were able to use. And hopefully, as you get those answers, you will be able to forward those to — to us, and it'll help shape our thinking in terms of how — where we go from here. So — but overall I think it's been a very informative hearing, Mr. Chairman, and — and — so I'm — I'm ready to wrap it up.

GRASSLEY: Yes, I probably wouldn't make this comment, but — your response to him in regard to political speech. I won't identify the CEO I had a conversation with yesterday, from one of our platforms, but he admitted to being more left than right — being left, I guess, is what he admitted — and I'm not asking you what you are. But just so you understand: probably as liberals have a lot of concerns about, you know, the leaning of — of Fox News, conservatives have questions about the leaning of — of MSNBC, let's say.

It seems to me that when you — when we get — whether it's from the right or the left, so I'm speaking to you for your platform, there's a great deal of cynicism in American society about government generally.

And then when there are suspicions, legitimate or not, that maybe you're playing one way unfairly toward the other, it seems to me that everything you can do to lean over backwards to make sure that you are fair in protecting political speech, right or left, you ought to do it.

And I'm not telling you how to do it, and I'm not saying you don't do it, but we've — we got to do something that reduces cynicism. At my town meetings in Iowa, I always get this question, how come you guys in D.C. can't get along?

You know, meaning Republicans and Democrats. Well, I try to explain to them that they kind of get an obtuse — what would you say — view of what goes on here, because controversy makes news, so if people are getting along, you never hear about that.

So they get a distorted view of it, and — and really we — congressmen get along more than the public thinks. But these attitudes of the public, we've got to change and people of your position and your influence, you can do a lot to change this.

Whether — I know you've got plenty of demands on your time — through your corporation or privately, anything you can do to reduce this cynicism, because we have a — a perfect constitution, maybe it's not perfect, but we've got a very good constitution, the longest-standing written constitution in the history of man — mankind.

And — but if people don't have faith in the institutions of government — and it's — it's our responsibility to enhance that faith so they have less cynicism in us — you know, we don't have a very strong democracy just because we've got a good constitution.

GRASSLEY: So I hope that everybody will do whatever they can to help enhance respect for government, myself included. I've got to bend over backwards to do what I can so I don't add to that cynicism.

So, I'm sorry you had to listen to me.

(LAUGHTER)

And so, this concludes today's hearing. Thanks to all the witnesses (sic) for attending.

The record will be open for 14 days for the members to submit additional written questions and for the witness, Mr. Zuckerberg, to make any corrections to his testimony.

The hearing is adjourned.

List of Panel Members and Witnesses
PANEL MEMBERS:

SENATE COMMERCE, SCIENCE AND TRANSPORTATION COMMITTEE

SEN. JOHN THUNE, R-S.D., CHAIRMAN

SEN. ROGER WICKER, R-MISS.

SEN. ROY BLUNT, R-MO.

SEN. DEAN HELLER, R-NEV.

SEN. TED CRUZ, R-TEX.

SEN. DEB FISCHER, R-NEB.

SEN. RON JOHNSON, R-WIS.

SEN. CORY GARDNER, R-COLO.

SEN. JERRY MORAN, R-KAN.

SEN. DAN SULLIVAN, R-ALASKA

SEN. JAMES M. INHOFE, R-OKLA.

SEN. SHELLEY MOORE CAPITO, R-W.VA.

SEN. MIKE LEE, R-UTAH

SEN. TODD C. YOUNG, R-IND.

SEN. BILL NELSON, D-FLA., RANKING MEMBER

SEN. MARIA CANTWELL, D-WASH.

SEN. AMY KLOBUCHAR, D-MINN.

SEN. RICHARD BLUMENTHAL, D-CONN.

SEN. BRIAN SCHATZ, D-HAWAII

SEN. EDWARD J. MARKEY, D-MASS.

SEN. GARY PETERS, D-MICH.

SEN. TOM UDALL, D-N.M.

SEN. TAMMY BALDWIN, D-WIS.

SEN. TAMMY DUCKWORTH, D-ILL.

SEN. MAGGIE HASSAN, D-N.H.

SEN. CATHERINE CORTEZ MASTO, D-NEV.

SEN. JON TESTER, D-MONT.

SENATE JUDICIARY COMMITTEE

SEN. CHARLES E. GRASSLEY, R-IOWA, CHAIRMAN

SEN. ORRIN G. HATCH, R-UTAH

SEN. LINDSEY O. GRAHAM, R-S.C.

SEN. JOHN CORNYN, R-TEX.

SEN. MIKE LEE, R-UTAH

SEN. TED CRUZ, R-TEX.

SEN. JEFF FLAKE, R-ARIZ.

SEN. THOM TILLIS, R-N.C.

SEN. BEN SASSE, R-NEB.

SEN. MICHAEL D. CRAPO, R-IDAHO

SEN. JOHN NEELY KENNEDY, R-LA.

SEN. DIANNE FEINSTEIN, D-CALIF., RANKING MEMBER

SEN. PATRICK J. LEAHY, D-VT.

SEN. RICHARD J. DURBIN, D-ILL.

SEN. SHELDON WHITEHOUSE, D-R.I.

SEN. AMY KLOBUCHAR, D-MINN.

SEN. CHRISTOPHER A. COONS, D-DEL.

SEN. RICHARD BLUMENTHAL, D-CONN.

SEN. MAZIE K. HIRONO, D-HAWAII

SEN. CORY BOOKER, D-N.J.

SEN. KAMALA D. HARRIS, D-CALIF.

WITNESSES:

FACEBOOK CEO MARK ZUCKERBERG TESTIFIES


Terms of Sale for Digital Products

Updated June 4, 2018

This Terms of Sale governs the sale of Washington Post Digital Products (the “Digital Products”). By using the Digital Products, you also agree to our Terms of Service and Privacy Policy.

  1. Digital Products

    The Washington Post Digital Products include the website (www.washingtonpost.com), mobile site, and tablet and mobile apps. You are not necessarily required to purchase anything to use The Post’s Digital Products. If you do not purchase a subscription or product, however, your access to the Digital Products will be limited.

    You can view The Post’s various subscription offerings at https://subscribe.washingtonpost.com. We also offer gift subscriptions at https://subscribe.washingtonpost.com/gift.

    The Post reserves the right to modify the content, type and availability of any Digital Product at any time.

  2. Subscription

    a. Auto-renewing Subscription. Your Digital Products subscription, which may start with a promotional rate, will auto-renew at the end of the cycle stated at the time of your order (“Billing Period”) unless and until you cancel your subscription or we terminate it. You can view the date of your next scheduled payment by visiting our website and clicking on the “My Account” link. You will not receive a notice from us that your promotional period has ended or that your subscription has auto-renewed.

    b. Differing Subscriptions/Promotions. The Post may offer a number of types of subscriptions, including subscriptions to different Washington Post products and special promotions. Any materially different terms from those described in these Terms of Sale will be disclosed at the time of purchase or in other communications made available to you. You can find specific details regarding your subscription by visiting our website and clicking on the “My Account” link. We reserve the right to change or terminate any offered subscriptions or promotions at any time.

    c. Premium EU Subscription. Users in the European Economic Area (“EEA”) and Switzerland have the option to subscribe to our Premium EU product, which offers unlimited access to www.washingtonpost.com (the “Site”) on any device, as well as all of our apps. The Premium EU product has no advertising or third-party ad tracking on the Site. Please note, however, that the Site may include content embedded from other sites and services (such as Facebook, Twitter, and YouTube) and that such third-party content may contain some advertising and third-party ad tracking from those sites or services. In addition, our apps and newsletters may contain advertising and third-party ad tracking. Please view our Privacy Policy for more information.

  3. Billing

    a. Payment Method. You can pay for your subscription with a major credit card, or by Amazon Pay or PayPal (“Payment Method”). Only credit cards, Amazon Pay or PayPal are eligible for payment. Do not sign up for a subscription by identifying a debit card in the credit card option. A debit card may also be known as a “check” or “ATM” card and typically has the word “debit” on it. You may edit your Payment Method information by visiting our website and clicking on the “My Account” link. If your payment is unsuccessful by reason of insufficient funds, expiration, or otherwise, you remain responsible for any uncollected amount.

    b. Recurring Billing. By placing an order for a subscription, you authorize us to charge you the subscription fee then in effect at the beginning of each Billing Period, plus applicable taxes, to your Payment Method. You acknowledge that the amount charged each Billing Period may vary for reasons that may include price changes, changing your subscription, or changes in applicable taxes, and you authorize us to charge your Payment Method for such varying amount each Billing Period. Applicable taxes may vary. For example, you authorize us to charge your Payment Method the promotional rate disclosed on the subscription screen in the initial Billing Period (if applicable) and the regular subscription rate in subsequent Billing Periods, each with applicable taxes. We automatically bill your Payment Method on the last day of each Billing Period. We reserve the right to change the timing of our billing, in particular, in the event your Payment Method has not successfully settled. If your Payment Method is declined for a recurring payment of your subscription fee, you have four (4) days to provide us a new Payment Method or your subscription will be canceled.

    c. Price Changes. We reserve the right to change subscription fees for any of our subscriptions at any time. We will notify you of any changes if the regular fee for your subscription changes from what was stated at the time of your initial order. You will have an opportunity to cancel your subscription at that time. If you do not cancel your subscription, you will be charged the new subscription fee at your next Billing Period.

    d. Billing Period. We will charge the subscription fee at the commencement of your subscription or, if applicable, at the end of your free trial period, and automatically on the first calendar day of each Billing Period thereafter unless and until your subscription is cancelled.

    e. One-Time Purchases. When you purchase a stand-alone product, such as a gift subscription, we will charge your Payment Method at the time of purchase.

  4. Cancellations and Refunds

    a. Cancellations. You can cancel your subscription at any time by going to My Account. You must cancel your subscription before it renews each Billing Period to avoid billing of the next Billing Period’s subscription fees to your Payment Method. Accordingly, when you cancel, you cancel only future charges associated with your subscription, and you will not receive a refund for the current Billing Period. Your cancellation will become effective at the end of your current Billing Period, and you will continue to have access to your subscription for the balance of the Billing Period. Any purchases of a Digital Product subscription through a third party (e.g., app store), are subject to that third party’s cancellation policies and procedures.

    b. Refunds. Payments are non-refundable, and there are no refunds or credits for partially used Billing Periods. We reserve the right, however, to issue refunds or credits at our sole discretion. If we issue a refund or credit in one instance, we are under no obligation to issue the same refund or credit in the future.

  5. E-Sign Disclosure and Consent. By purchasing a Digital Products subscription and/or clicking on the box at account opening, you consent to receive notices, disclosures, agreements, policies, receipts, confirmations, transaction information, account information, other communications, and changes or updates to any such documents electronically (collectively, the “Electronic Communications”). We will provide these Electronic Communications by posting them on the Washington Post website, the “My Account” page, and/or emailing them to your primary email address associated with your Digital Products subscription. You agree that the Electronic Communications will satisfy any legal communication requirements, including that such communications be in writing. Electronic Communications will be deemed received by you within 24 hours of the time posted to our website or the “My Account” page, or within 24 hours of the time emailed to you unless we receive notice that the email was not delivered.

    a. System Requirements to Access Information. To receive Electronic Communications, you must have the following equipment and software:

    • a computer or other device with an Internet connection;
    • a current web browser that includes 128-bit encryption (e.g. Internet Explorer version 6.0 and above, Firefox version 2.0 and above, Chrome version 3.0 and above, or Safari 3.0 and above) with cookies enabled;
    • Adobe Acrobat Reader version 8.0 and above to open documents in .pdf format;
    • a valid email address (your primary email address associated with the Digital Products Subscription); and
    • sufficient storage space or other methods (e.g., a USB drive or secure online storage) to save past Electronic Communications or a printer to print them.

    Your access to this page verifies that your system/device meets these requirements. You also confirm that you have access to the necessary equipment and are able to receive, open, print, or store Electronic Communications.

    It is your responsibility to keep your primary email address up to date. You can change your primary email address on the “My Account” page. You agree that Electronic Communications sent to a primary email address that is incorrect, out of date, blocked by your service provider, or cannot be received due to your failure to maintain the system requirements, will be deemed to have been provided to you. If an Electronic Communication is returned to us because your email address becomes invalid, we may deem your subscription to be inactive, and you will not be able to use the Digital Products until we receive a valid, working primary email address from you.

    We will notify you if there are any material changes to the hardware or software needed to receive Electronic Communications.

    b. Paper Delivery of Disclosures and Notices. You have the right to receive a paper copy of the Electronic Communications. To receive a paper copy at no charge, please request it in one of the following ways: (1) go to the Washington Post Help Desk www.washingtonpost.com/contactus and send us a message requesting a paper copy of Electronic Communications and include your name, email address and mailing address; or (2) call us at 202-334-6100 and speak to the customer service representative. Any withdrawal of your consent to receive Electronic Communications will be effective only after we have a reasonable period of time to process your withdrawal. You understand and agree that if you withdraw your consent we may – though we are not required to – cancel your Digital Products subscription.

  6. Changes to the Terms of Sale. We may, from time to time, change these Terms of Sale. When such changes are made, we will make a copy of the new Terms of Sale available to you on our website.

Pentagon faces internal questions about program to screen recruits with foreign ties, emails show

Officials have touted the program as a way to speed up vetting of recruits who have what the Pentagon considers “foreign nexus” risks.


Marine Sgt. Edson Mejia Jimenez, left, originally from Colombia, and Army Pvt. Sehyeon Park, originally from South Korea, take the Oath of Allegiance along with other citizenship candidates during a naturalization ceremony in Virginia on Feb. 22, 2016. (Michael Reynolds/European Pressphoto Agency)

A Pentagon program designed to screen potential recruits with foreign ties, including green-card holders and some U.S. citizens, has prompted questions from military officials about whether it will have detrimental effects on the services, according to emails and documents obtained by The Washington Post.

Defense officials touted the program as a way to speed up vetting of recruits who have what the Pentagon considers “foreign nexus” risks. The process could be completed “in a matter of days or . . . in a few weeks, as compared to months and years” required under traditional background checks, according to one Defense Department memo.

The program, which was tested by the Army last summer but has not been implemented, would rely on mining several government databases for information.

But the plan also may come with complications, according to emails obtained by The Post. That would be a concern for a military that has long sought to attract immigrants to meet its recruiting goals in part by promoting the possibility of U.S. citizenship.

Discussions about the program began in earnest after a federal judge issued a preliminary injunction in November ordering the Pentagon to begin sending a backlog of thousands of green-card recruits to initial training. The order came after two prospective recruits — one born in China and interested in joining the Navy and one originally from Jamaica who planned to join the Air Force — sued the Pentagon, arguing that months-long delays in screening had caused them harm.

The two men were among thousands who were left in limbo after the Trump administration, citing security concerns, adopted a policy in October 2017 that called for green-card holders to submit to more stringent background checks before they could go to boot camp. That was in addition to standard requirements for green-card applicants, such as biometrics screening.

The program would need approval in court to overcome the injunction. But internally, some defense officials have expressed concern that it also will create some delays.

Russ Beland, a senior civilian official in the Navy Department, said in a Feb. 27 email obtained by The Post that the estimates officials were using to determine which recruits needed additional screening “may be far too low.” After assessing its pool of recruits waiting to go to initial training, the Navy determined that “somewhere between a third and half” could require new screening, he wrote.

“I recognize there are risks from inadequate screening, but there are also risks from gapped billets,” Beland said, using military parlance for empty slots in training.

In response, Lernes Hebert, a senior defense official overseeing personnel issues, said he was committed to working with the Navy Department on exceptions to the policy “if class seats are at risk of going vacant.” In that case, he wrote, the Pentagon would require tracking recruits who are identified for additional screening to be completed “as soon as possible” while they make their way through initial stages of training.

Such exceptions would be rare, Hebert predicted, and would require Pentagon approval.

Beland said he had concerns about that, too. By the time a recruiting command became aware of concerns about a recruit, it could be too late, he wrote. If every case must go up to that level at the Pentagon, he added, it “does not sound workable to me if we encounter widespread delays.”

Beland, in an email, said that he could not comment on the messages because the policy is “in a pre-decisional state.”

Hebert referred comment to the Pentagon’s public affairs office.

Air Force Lt. Col. Carla Gleason, a Pentagon spokeswoman, said that she was unable to address questions but that the Defense Department needs “every qualified patriot who is willing and able to serve.” As of May 2018, about 19,800 noncitizens were among the nation’s 1.2 million active-duty service members.

The Trump administration’s new restrictions on service members with foreign ties have also included the end of a program begun in 2008 to attract foreign recruits with key medical and language skills. That effort, known as the Military Accessions Vital to National Interests (MAVNI) program, offered a path to citizenship but ended in 2017 after U.S. officials concluded it was vulnerable to insider threats.

The Pentagon began discharging some service members who joined the military under MAVNI, but suspended the process in 2018. In a lawsuit brought by 17 U.S. service members who became U.S. citizens through MAVNI, lawyers argued during a trial late last year that the Pentagon was treating them differently from other citizens by requiring them to undergo extensive biannual screening.

In January, U.S. District Judge Thomas S. Zilly found in the MAVNI troops’ favor, ruling that the Pentagon had not met its burden of proof to require the screening.

During the trial, Stephanie Miller, a senior Pentagon official involved in recruiting, said the Defense Department Inspector General and intelligence agencies had warned defense officials that “direct threats for espionage” had been identified in the MAVNI program and that “hostile governments” were targeting it.

Under questioning, Miller said that in the program’s nearly 10-year history, one person who attempted to join through MAVNI had been charged in an espionage case. That person had not yet obtained U.S. citizenship or a security clearance. More than 10,000 U.S. troops joined the military through the program.

Miller referred questions to the Pentagon’s public affairs office.

In the other pending case, the American Civil Liberties Union and the law firm Latham & Watkins have argued in federal court that obtaining a green card already requires significant screening and that requiring even more is not only discriminatory but also harms the Armed Forces by withholding recruits.

The Justice Department, arguing on behalf of the Pentagon, has countered that researching the background of someone who was not born in the United States can be difficult and that some recruits had falsified information while seeking security clearances. The case could go to trial this year.


Mark Zuckerberg: The Internet needs new rules. Let’s start in these four areas.


Facebook, Messenger and Instagram apps on an iPhone screen. (Jenny Kane/AP)

Mark Zuckerberg is founder and chief executive of Facebook.

Technology is a major part of our lives, and companies such as Facebook have immense responsibilities. Every day, we make decisions about what speech is harmful, what constitutes political advertising, and how to prevent sophisticated cyberattacks. These are important for keeping our community safe. But if we were starting from scratch, we wouldn’t ask companies to make these judgments alone.

I believe we need a more active role for governments and regulators. By updating the rules for the Internet, we can preserve what’s best about it — the freedom for people to express themselves and for entrepreneurs to build new things — while also protecting society from broader harms.

From what I’ve learned, I believe we need new regulation in four areas: harmful content, election integrity, privacy and data portability.

First, harmful content. Facebook gives everyone a way to use their voice, and that creates real benefits — from sharing experiences to growing movements. As part of this, we have a responsibility to keep people safe on our services. That means deciding what counts as terrorist propaganda, hate speech and more. We continually review our policies with experts, but at our scale we’ll always make mistakes and decisions that people disagree with.

Lawmakers often tell me we have too much power over speech, and frankly I agree. I’ve come to believe that we shouldn’t make so many important decisions about speech on our own. So we’re creating an independent body so people can appeal our decisions. We’re also working with governments, including French officials, on ensuring the effectiveness of content review systems.

Internet companies should be accountable for enforcing standards on harmful content. It’s impossible to remove all harmful content from the Internet, but when people use dozens of different sharing services — all with their own policies and processes — we need a more standardized approach.

One idea is for third-party bodies to set standards governing the distribution of harmful content and to measure companies against those standards. Regulation could set baselines for what’s prohibited and require companies to build systems for keeping harmful content to a bare minimum.

Facebook already publishes transparency reports on how effectively we’re removing harmful content. I believe every major Internet service should do this quarterly, because it’s just as important as financial reporting. Once we understand the prevalence of harmful content, we can see which companies are improving and where we should set the baselines.
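The prevalence measurement described above can be made concrete with a toy calculation. This is a hypothetical metric, sampling content views and counting how many contained violating material; the sampling design is an illustrative assumption, not Facebook's published methodology.

```python
# Hypothetical prevalence metric: the share of sampled content views
# that contained policy-violating material. The sampling design here
# is an illustrative assumption, not any company's actual methodology.
def prevalence(violating_views: int, sampled_views: int) -> float:
    return violating_views / sampled_views

# If 7 of 10,000 sampled views contained violating content:
print(f"{prevalence(7, 10_000):.2%}")  # 0.07%
```

A common metric like this is what would let regulators compare companies and set the baselines the op-ed calls for.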

Second, legislation is important for protecting elections. Facebook has already made significant changes around political ads: Advertisers in many countries must verify their identities before purchasing political ads. We built a searchable archive that shows who pays for ads, what other ads they ran and what audiences saw the ads. However, deciding whether an ad is political isn’t always straightforward. Our systems would be more effective if regulation created common standards for verifying political actors.

Online political advertising laws primarily focus on candidates and elections, rather than divisive political issues where we’ve seen more attempted interference. Some laws only apply during elections, although information campaigns are nonstop. And there are also important questions about how political campaigns use data and targeting. We believe legislation should be updated to reflect the reality of the threats and set standards for the whole industry.

Third, effective privacy and data protection needs a globally harmonized framework. People around the world have called for comprehensive privacy regulation in line with the European Union’s General Data Protection Regulation, and I agree. I believe it would be good for the Internet if more countries adopted regulation such as GDPR as a common framework.

New privacy regulation in the United States and around the world should build on the protections GDPR provides. It should protect your right to choose how your information is used — while enabling companies to use information for safety purposes and to provide services. It shouldn’t require data to be stored locally, which would make it more vulnerable to unwarranted access. And it should establish a way to hold companies such as Facebook accountable by imposing sanctions when we make mistakes.

I also believe a common global framework — rather than regulation that varies significantly by country and state — will ensure that the Internet does not get fractured, entrepreneurs can build products that serve everyone, and everyone gets the same protections.

As lawmakers adopt new privacy regulations, I hope they can help answer some of the questions GDPR leaves open. We need clear rules on when information can be used to serve the public interest and how it should apply to new technologies such as artificial intelligence.

Finally, regulation should guarantee the principle of data portability. If you share data with one service, you should be able to move it to another. This gives people choice and enables developers to innovate and compete.

This is important for the Internet — and for creating services people want. It’s why we built our development platform. True data portability should look more like the way people use our platform to sign into an app than the existing ways you can download an archive of your information. But this requires clear rules about who’s responsible for protecting information when it moves between services.

This also needs common standards, which is why we support a standard data transfer format and the open source Data Transfer Project.
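A hypothetical sketch of what a standard transfer format might look like: user data serialized to a self-describing, versioned JSON document that any receiving service could validate and import. All field and format names here are illustrative assumptions, not the Data Transfer Project's actual schema.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical portable record; field names are illustrative,
# not the Data Transfer Project's actual schema.
@dataclass
class PortablePost:
    author_id: str
    created_at: str  # ISO 8601 timestamp
    text: str

def export_archive(posts):
    """Serialize posts to a self-describing, versioned JSON document."""
    return json.dumps({
        "format": "example-portable-archive",
        "version": 1,
        "posts": [asdict(p) for p in posts],
    }, indent=2)

def import_archive(blob):
    """Parse and minimally validate an exported archive at the receiving service."""
    doc = json.loads(blob)
    if doc.get("format") != "example-portable-archive":
        raise ValueError("unrecognized archive format")
    return [PortablePost(**p) for p in doc["posts"]]

archive = export_archive([PortablePost("u1", "2019-03-30T12:00:00Z", "hello")])
restored = import_archive(archive)
print(restored[0].text)  # hello
```

The design point is the one the op-ed makes: because both sides agree on the format, the importing service needs no knowledge of the exporting service's internals.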

I believe Facebook has a responsibility to help address these issues, and I’m looking forward to discussing them with lawmakers around the world. We’ve built advanced systems for finding harmful content, stopping election interference and making ads more transparent. But people shouldn’t have to rely on individual companies addressing these issues by themselves. We should have a broader debate about what we want as a society and how regulation can help. These four areas are important, but, of course, there’s more to discuss.

The rules governing the Internet allowed a generation of entrepreneurs to build services that changed the world and created a lot of value in people’s lives. It’s time to update these rules to define clear responsibilities for people, companies and governments going forward.


Letters to the Editor

Tech touches every issue. That’s why Congress needs to keep up.


Facebook CEO Mark Zuckerberg appears for a hearing at the Hart Senate Office Building on April 10. (Matt McClain/The Washington Post)

In the Sept. 18 editorial “Tech tutors for Congress,” The Post encouraged Congress to restart the Office of Technology Assessment (OTA), asserting that Capitol Hill needs resources to grapple with how technology is reshaping society. That is absolutely true — tech expertise would help Congress become more effective. I know because I’ve been embedding technologists in Congress for the past three years.

When I was working for then-Rep. Henry A. Waxman (D-Calif.), I needed tech expertise and advice. That’s why I founded TechCongress, which places technologists in congressional offices through a one-year fellowship. Of the 3,500 legislative staffers in Congress, I’ve found just seven who have formal technical training — meaning most staff are unprepared or reliant on interest groups to understand complex technology issues.

The Post notes that Congress “needs to understand what it is doing,” and a recent independent evaluation of TechCongress shows that technologists will help. Fellows act as fact-checkers on interest groups and have provided support on issues ranging from privacy and cybersecurity to drones and biotechnology.

Tech increasingly touches every issue, and Congress needs tech expertise in-house to keep up. Restarting OTA will do just that.

Travis Moore, San Francisco

The writer is the founder of TechCongress.



Hospital viruses: Fake cancerous nodes in CT scans, created by malware, trick radiologists

Researchers in Israel created malware to draw attention to serious security weaknesses in medical imaging equipment and networks.


(JohnnyGreig/iStock)

When Hillary Clinton stumbled and coughed through public appearances during her 2016 presidential run, she faced critics who said that she might not be well enough to perform the top job in the country. To quell rumors about her medical condition, her doctor revealed that a CT scan of her lungs showed that she just had pneumonia.

But what if the scan had shown faked cancerous nodules, placed there by malware exploiting vulnerabilities in widely used CT and MRI scanning equipment? Researchers in Israel say they have developed such malware to draw attention to serious security weaknesses in critical medical imaging equipment used for diagnosing conditions and the networks that transmit those images — vulnerabilities that could have potentially life-altering consequences if unaddressed.

The malware they created would let attackers automatically add realistic, malignant-seeming growths to CT or MRI scans before radiologists and doctors examine them. Or it could remove real cancerous nodules and lesions without detection, leading to misdiagnosis and possibly a failure to treat patients who need critical and timely care.

Yisroel Mirsky, Yuval Elovici and two others at the Ben-Gurion University Cyber Security Research Center in Israel who created the malware say that attackers could target a presidential candidate or other politicians to trick them into believing they have a serious illness and cause them to withdraw from a race to seek treatment.

The research isn’t theoretical. In a blind study the researchers conducted involving real CT lung scans, 70 of which were altered by their malware, they were able to trick three skilled radiologists into misdiagnosing conditions nearly every time. In the case of scans with fabricated cancerous nodules, the radiologists diagnosed cancer 99 percent of the time. In cases where the malware removed real cancerous nodules from scans, the radiologists said those patients were healthy 94 percent of the time.

Even after the radiologists were told that the scans had been altered by malware and were given a second set of 20 scans, half of which were modified, they still were tricked into believing the scans with fake nodules were real 60 percent of the time, leading them to misdiagnoses involving those patients. In the case of scans where the malware removed cancerous nodules, doctors did not detect this 87 percent of the time, concluding that very sick patients were healthy.

The researchers ran their test against a lung-cancer screening software tool that radiologists often use to confirm their diagnoses and were able to trick it into misdiagnosing the scans with false tumors every time.

“I was quite shocked,” said Nancy Boniel, a radiologist in Canada who participated in the study. “I felt like the carpet was pulled out from under me, and I was left without the tools necessary to move forward.”

The study focused on lung cancer scans only. But the attack would work for brain tumors, heart disease, blood clots, spinal injuries, bone fractures, ligament injuries and arthritis, Mirsky said.

Attackers could choose to modify random scans to create chaos and mistrust in hospital equipment, or they could target specific patients, searching for scans tagged with a specific patient’s name or ID number. In doing this, they could prevent patients who have a disease from receiving critical care or cause others who aren’t ill to receive unwarranted biopsies, tests and treatment. The attackers could even alter follow-up scans after treatment begins to falsely show tumors spreading or shrinking. Or they could alter scans for patients in drug and medical research trials to sabotage the results.

The vulnerabilities that would allow someone to alter scans reside in the equipment and networks hospitals use to transmit and store CT and MRI images. These images are sent to radiology workstations and back-end databases through what’s known as a picture archiving and communication system (PACS). Mirsky said the attack works because hospitals don’t digitally sign the scans to prevent them from being altered without detection and don’t use encryption on their PACS networks, allowing an intruder on the network to see the scans and alter them.

“They’re very, very careful about privacy … if data is being shared with other hospitals or other doctors,” Mirsky said, “because there are very strict rules about privacy and medical records. But what happens within the [hospital] system itself, which no regular person should have access to in general, they tend to be pretty lenient [about]. It’s not ... that they don’t care. It’s just that their priorities are set elsewhere.”

Although one hospital network they examined in Israel did try to use encryption on its PACS network, the hospital configured the encryption incorrectly and as a result the images were still not encrypted.

Fotios Chantzis, a principal information-security engineer with the Mayo Clinic in Minnesota who did not participate in the study but confirmed that the attack is possible, said that PACS networks are generally not encrypted. That’s in part because many hospitals still operate under the assumption that what’s on their internal network is inaccessible from outside — even though “the era where the local hospital network was a safe, walled garden is long gone,” he said.

Although encryption is available for some PACS software now, it’s still generally not used, for compatibility reasons: the software has to communicate with older systems that lack the ability to decrypt or re-encrypt images.

To develop their malware, the Israeli researchers used machine learning to train their code to rapidly assess scans passing through a PACS network and to adjust and scale fabricated tumors to conform to a patient’s unique anatomy and dimensions to make them more realistic. The entire attack can be fully automated so that once the malware is installed on a hospital’s PACS network, it will operate independently of the researchers to find and alter scans, even searching for a specific patient’s name.

To get the malware onto a PACS network, attackers would need either physical access to the network — to connect a malicious device directly to the network cables — or they could plant malware remotely from the Internet. The researchers found that many PACS networks are either directly connected to the Internet or accessible through hospital machines that are connected to the Internet.

To see how easy it would be to physically install malware on a PACS network, Mirsky conducted a test at a hospital in Israel that the researchers videotaped. He was able to enter the radiology department after hours and connect his malicious device to the network in just 30 seconds, without anyone questioning his presence. Although the hospital had given permission for the test, staff members didn’t know how or when Mirsky planned to carry it out.

To prevent someone from altering CT and MRI scans, Mirsky says, ideally hospitals would enable end-to-end encryption across their PACS network and digitally sign all images while also making sure that radiology and doctor workstations are set up to verify those signatures and flag any images that aren’t properly signed.
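The sign-then-verify flow Mirsky recommends can be sketched minimally with a keyed hash standing in for real digital signatures. In practice a hospital would use public-key signatures rather than a shared secret, and the key and image bytes here are illustrative; this is a toy model of the idea, not PACS code.

```python
import hashlib
import hmac

KEY = b"shared-secret"  # illustrative; a real deployment would use PKI keys

def sign(image_bytes: bytes) -> str:
    """Compute a keyed digest when the scan is produced, attached to the image."""
    return hmac.new(KEY, image_bytes, hashlib.sha256).hexdigest()

def verify(image_bytes: bytes, tag: str) -> bool:
    """At the workstation, recompute the digest and compare in constant time."""
    return hmac.compare_digest(sign(image_bytes), tag)

scan = b"...pixel data..."
tag = sign(scan)                      # signed at the scanner

tampered = scan + b"injected nodule"  # altered in transit on the PACS network
print(verify(scan, tag))              # True: image unchanged
print(verify(tampered, tag))          # False: alteration flagged
```

The point of the researchers' recommendation is visible in the last line: any alteration of the bytes after signing, however realistic the fabricated tissue looks to a radiologist, fails verification.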

Suzanne Schwartz, a medical doctor and the Food and Drug Administration’s associate director for Science and Strategic Partnerships, who has been leading some of the FDA’s effort to secure medical devices and equipment, expressed concern about the findings of the Israeli researchers. But she said many hospitals don’t have the money to invest in more secure equipment, or they have 20-year-old infrastructure that doesn’t support newer technologies.

“It’s going to require changes that go well beyond devices, but changes with regards to the network infrastructure,” Schwartz said. “This is where engaging and involving with other authorities and trying to bring the entire community together becomes really important.”

Christian Dameff, an emergency room physician with the University of California at San Diego School of Medicine and a security researcher who has exposed vulnerabilities with the 911 emergency calling system, notes that in the case of a cancer diagnosis, some backstops would prevent a patient from receiving unwarranted treatment based only on a maliciously modified CT scan. But that doesn’t mean the attack would be harmless.

“There are a couple of steps before we just take someone to surgery” or prescribe radiation and chemotherapy, Dameff said. “But there is still harm to the patient regardless. There is the emotional distress [from learning you may have cancer], and there are all sorts of insurance implications.”

The radiologists in the BGU study recommended follow-up treatment and referrals to a specialist for all of the patients with scans that showed cancerous lung nodules. They recommended immediate tissue biopsies or other surgery for at least a third of them.

Correction: This story has been updated to reflect that the hospital in Israel didn’t encrypt any data passed over its network. An earlier version of the story said it had encrypted the metadata for the scans, which contains a patient’s name and medical ID.


Philip Rucker

Washington, D.C.

White House Bureau Chief
Education: Yale University, B.A. in History, 2006
Philip Rucker is the White House Bureau Chief for The Washington Post. He previously covered Congress, the Obama White House, and the 2012 and 2016 presidential campaigns. Rucker is also a political analyst for NBC News and MSNBC. He joined The Post in 2005 as a local news reporter.
Honors & Awards:
  • Pulitzer Prize for National Reporting, 2018, for coverage of Russian interference
  • George Polk Award, 2018, for coverage of Russian interference
  • Gerald R. Ford Journalism Prize, 2018, for distinguished reporting on the presidency
  • Sigma Delta Chi Award, 2018, for coverage of Russian interference
Professional Affiliations: Yale Daily News Foundation, Board Member
Latest from Philip Rucker

The president’s legal team conducted an extensive pressure campaign to keep him from coming face-to-face with federal investigators — fearful he would perjure himself.

  • Mar 28, 2019

The president and his supporters moved quickly to capitalize on the findings of the Russia investigation.

  • Mar 24, 2019

Despite few public details, those in the president’s orbit say they are heartened by the revelation that no additional Mueller indictments are forthcoming.

  • Mar 23, 2019

President’s actions in office, including his firing of FBI Director James B. Comey, were scrutinized by special counsel Robert S. Mueller as possible obstruction of justice.

  • Mar 22, 2019

From victim to bully and auditor to rebel, the president's performance at CPAC captured his unorthodox presidency.

  • Mar 9, 2019

The president’s “cost plus 50” formula has struck fear in the hearts of countries that host American troops.

  • Mar 9, 2019

The former Fox News executive is leaving the Trump administration after eight months.

  • Mar 8, 2019

Lawmakers are treading carefully when it comes to investigating Trump’s family. Some want to avoid them altogether; others say they must be held accountable.

  • Mar 7, 2019

Lawmakers are trying to sort out the competing narratives surrounding the president’s former fixer.

  • Mar 5, 2019

"I'm not going to campaign against someone I've been a friend with and worked with,” Corker said.

  • Apr 18, 2018

Jamal Khashoggi: What the Arab world needs most is free expression


Jamal Khashoggi (Illustration by Alex Fine for The Washington Post)

A note from Karen Attiah, Global Opinions editor

I received this column from Jamal Khashoggi’s translator and assistant the day after Jamal was reported missing in Istanbul. The Post held off publishing it because we hoped Jamal would come back to us so that he and I could edit it together. Now I have to accept: That is not going to happen. This is the last piece of his I will edit for The Post. This column perfectly captures his commitment and passion for freedom in the Arab world. A freedom he apparently gave his life for. I will be forever grateful he chose The Post as his final journalistic home one year ago and gave us the chance to work together.

I was recently online looking at the 2018 “Freedom in the World” report published by Freedom House and came to a grave realization. There is only one country in the Arab world that has been classified as “free.” That nation is Tunisia. Jordan, Morocco and Kuwait come second, with a classification of “partly free.” The rest of the countries in the Arab world are classified as “not free.”

As a result, Arabs living in these countries are either uninformed or misinformed. They are unable to adequately address, much less publicly discuss, matters that affect the region and their day-to-day lives. A state-run narrative dominates the public psyche, and while many do not believe it, a large majority of the population falls victim to this false narrative. Sadly, this situation is unlikely to change.

The Arab world was ripe with hope during the spring of 2011. Journalists, academics and the general population were brimming with expectations of a bright and free Arab society within their respective countries. They expected to be emancipated from the hegemony of their governments and the consistent interventions and censorship of information. These expectations were quickly shattered; these societies either fell back to the old status quo or faced even harsher conditions than before.

My dear friend, the prominent Saudi writer Saleh al-Shehi, wrote one of the most famous columns ever published in the Saudi press. He unfortunately is now serving an unwarranted five-year prison sentence for supposed comments contrary to the Saudi establishment. The Egyptian government’s seizure of the entire print run of a newspaper, al-Masry al Youm, did not enrage or provoke a reaction from colleagues. These actions no longer carry the consequence of a backlash from the international community. Instead, these actions may trigger condemnation quickly followed by silence.

As a result, Arab governments have been given free rein to continue silencing the media at an increasing rate. There was a time when journalists believed the Internet would liberate information from the censorship and control associated with print media. But these governments, whose very existence relies on the control of information, have aggressively blocked the Internet. They have also arrested local reporters and pressured advertisers to harm the revenue of specific publications.

There are a few oases that continue to embody the spirit of the Arab Spring. Qatar’s government continues to support international news coverage, in contrast to its neighbors’ efforts to uphold the control of information to support the “old Arab order.” Even in Tunisia and Kuwait, where the press is considered at least “partly free,” the media focuses on domestic issues but not issues faced by the greater Arab world. They are hesitant to provide a platform for journalists from Saudi Arabia, Egypt and Yemen. Even Lebanon, the Arab world’s crown jewel when it comes to press freedom, has fallen victim to the polarization and influence of pro-Iran Hezbollah.

The Arab world is facing its own version of an Iron Curtain, imposed not by external actors but through domestic forces vying for power. During the Cold War, Radio Free Europe, which grew over the years into a critical institution, played an important role in fostering and sustaining the hope of freedom. Arabs need something similar. In 1967, the New York Times and The Post took joint ownership of the International Herald Tribune newspaper, which went on to become a platform for voices from around the world.

My publication, The Post, has taken the initiative to translate many of my pieces and publish them in Arabic. For that, I am grateful. Arabs need to read in their own language so they can understand and discuss the various aspects and complications of democracy in the United States and the West. If an Egyptian reads an article exposing the actual cost of a construction project in Washington, then he or she would be able to better understand the implications of similar projects in his or her community.

The Arab world needs a modern version of the old transnational media so citizens can be informed about global events. More important, we need to provide a platform for Arab voices. We suffer from poverty, mismanagement and poor education. Through the creation of an independent international forum, isolated from the influence of nationalist governments spreading hate through propaganda, ordinary people in the Arab world would be able to address the structural problems their societies face.



Prosecutor says Khashoggi was strangled and dismembered, but fate of body still a mystery

Turkey’s top prosecutor on Wednesday laid out the most detailed description yet of how the journalist Jamal Khashoggi was killed, saying Saudi agents strangled him almost immediately after he entered the Saudi Consulate in Istanbul and then dismembered his body.

But the new information did not address the question that has bedeviled investigators and been the subject of furious speculation: What happened to Khashoggi’s remains?

A senior Turkish official said in an interview that Turkish authorities are pursuing a theory that Khashoggi’s dismembered body was destroyed in acid on the grounds of the Saudi Consulate or at the nearby residence of the Saudi consul general. Biological evidence discovered in the consulate garden supports the theory that Khashoggi’s body was disposed of close to where he was killed and dismembered, the official said.

“Khashoggi’s body was not in need of burying,” said the official, who spoke on the condition of anonymity to discuss a sensitive investigation.

While Saudi officials now acknowledge that Khashoggi was killed inside the consulate on Oct. 2, all they have said about his body is that the assailants gave it to a “local collaborator” for disposal.

The senior Turkish official said Turkish investigators do not believe such a figure exists.


Saudi Arabia’s top prosecutor, Saud al-Mojeb, leaves his country’s consulate in Istanbul on Oct. 30, 2018. (Can Erok/AP)

A second senior Turkish official said that Saudi Arabia’s top prosecutor, Saud al-Mojeb, who completed a three-day visit to Istanbul on Wednesday, did not provide the location of Khashoggi’s body or identify any “local collaborator.”

Since Mojeb arrived in Turkey on Monday, “Saudi officials seemed primarily interested in finding out what evidence the Turkish authorities had against the perpetrators,” the Turkish official said, speaking on the condition of anonymity to discuss private law enforcement contacts. “We did not get the impression that they were keen on genuinely cooperating with the investigation.”

Turkish prosecutor Irfan Fidan issued his public description of the killing shortly after Mojeb left Istanbul, amid mounting Turkish complaints about a lack of Saudi cooperation.

Fidan said Khashoggi was “strangled as soon as he entered the consulate” in line with “premeditated plans.” The body, “after being strangled, was subsequently destroyed by being dismembered, once again confirming the planning of the murder,” Fidan said.

The Turkish statement used the word “boğulmak,” which can also mean suffocation.

Turkish officials say members of a 15-man hit team dispatched from Saudi Arabia killed Khashoggi inside the consulate before flying out of Turkey later the same day. The Turkish government says it has an audio recording of what transpired inside the mission. Although Turkish officials have played the audio for CIA officials, including Director Gina Haspel, Turkish officials have not released the audio to the public.

Saudi Arabia has provided shifting explanations about what happened to Khashoggi, a Saudi citizen, contributing columnist to The Washington Post and critic of the Saudi leadership, including the de facto Saudi ruler, Crown Prince Mohammed bin Salman. For more than two weeks, Saudi authorities repeatedly denied any knowledge of Khashoggi’s whereabouts, then abruptly changed their account, blaming the killing on agents acting outside the Saudi government’s authority.

Turkish investigators initially focused their search for Khashoggi’s body in two wooded areas outside Istanbul, guided in part by surveillance footage that Turkish authorities said showed Saudi diplomatic vehicles apparently scouting Belgrad Forest the night before the journalist was killed.

Last week, investigators suspended the search, focusing instead on the consulate’s grounds and the consul general’s residence. The search focused in particular on a well on consular property, where the assailants could have disposed of Khashoggi’s dissolved remains, the first senior Turkish official said.

Investigators last week also inspected the sewer system near the consulate, according to Turkey’s state-run Anadolu news agency.

Turkish officials, including President Recep Tayyip Erdogan, have repeatedly complained that Saudi Arabia is hampering the investigation by refusing to provide critical pieces of information, including the location of Khashoggi’s body. Turkey has also requested the extradition of 18 suspects who the Saudi government says have been arrested in the case. 

Saudi Foreign Minister Adel al-Jubeir said the suspects will be tried in domestic courts.

On Wednesday, a Saudi official said the kingdom had not officially concluded that Khashoggi’s death was premeditated. “The public prosecutor has acknowledged seeing that information from the Turkish side. We have not said if that is true or not true. We are waiting for the results of the investigation,” the official said, speaking on the condition of anonymity because he was not permitted to speak to the media. 

The journalist’s death and the inconsistent Saudi explanations of his killing have unleashed a storm of international criticism, placing President Trump in a difficult situation. In addition to being a major purchaser of American weapons, Saudi Arabia sits at the heart of the administration’s strategy in the Middle East, in particular U.S. efforts to counter what Washington says are Iran’s expansionist policies.

Trump has said he is “not satisfied” with the Saudi explanations of Khashoggi’s death. Defense Secretary Jim Mattis has warned that the crisis could affect regional stability. But there are few indications that Khashoggi’s death will fundamentally alter the relationship between the two nations. 

On Wednesday, a group of Republican senators called on Trump to suspend negotiations for a U.S.-Saudi civil nuclear agreement.

They cited Khashoggi’s death, as well as Riyadh’s policies toward Lebanon and Yemen, as cause for “serious concerns about the transparency, accountability and judgment of current decision-makers.”

Although the Turkish announcement Wednesday appeared to partly illuminate what happened to Khashoggi, several central questions remain, including who ordered his killing and whether the crown prince was aware of the operation. While Riyadh has painted the killing as a rogue plot, Western officials say it is unlikely that something this complex could have been carried out without Mohammed’s knowledge.

French Foreign Minister Jean-Yves Le Drian said Wednesday that his government would take “necessary measures” against those responsible for the journalist’s death. 

“So long as those who are responsible and the circumstances around the killing are not made public, released and evaluated, we will go on demanding the truth,” he said.

Zeynep Karatas in Istanbul and Kevin Sullivan in Riyadh contributed to this report.


Palantir wins competition to build Army intelligence system


The contract awarded to Palantir Technologies is potentially worth more than $800 million. (Staff/AFP/Getty Images)

The Army has chosen Palantir Technologies to deploy a complex battlefield intelligence system for soldiers, according to Army documents, a significant boost for a company that has attracted a devoted following in national security circles but had struggled to win a major defense contract.

Industry experts said it marked the first time that the government had tapped a Silicon Valley software company, as opposed to a traditional military contractor, to lead a defense program of record, which has a dedicated line of funding from Congress. The contract is potentially worth more than $800 million.

The Army’s decision to go with Palantir, which was co-founded by Peter Thiel, the billionaire investor and sometime adviser to President Trump, brings to a close the latest chapter in a fierce competition.

In March 2018, the Army chose Palantir and Raytheon to vie for the next phase of the Distributed Common Ground System (or DCGS-A, for Army), which lets users gather and analyze information about enemy movements, terrain and weather to create detailed maps and reports in real time. The system is designed to be used by soldiers fighting in remote, harsh environments.

But critics within the Army and in Congress have for years complained that DCGS-A cost too much and didn’t deliver the intelligence and capabilities that soldiers needed. Some soldiers said the system was too hard to use and searched for alternatives.

Many became backers of Palantir, which sells to governments and businesses, including in the financial and health care sectors.

Palantir and its advocates argued that their software was cheaper and could meet all the Army’s requirements. But Army brass defended their decision to pay for a custom-built platform.

In 2016, Palantir successfully argued in court that the government was required by law to consider purchasing commercial products, when available, rather than custom ones.

That sent the Army back to the drawing board and led to the face-off between Palantir and Raytheon.

Before his death, Sen. John McCain (R-Ariz.) praised the new approach on Twitter, noting that after the Army had already spent $3 billion in development costs, “it was time to find another way.”

Raytheon and Palantir were allowed to test their respective software platforms with a live audience of soldiers, who told them what they liked and didn’t and what they would change. The two companies then refined their offerings to suit the Army’s needs.

Traditionally, the government first chooses a company to build a system according to a set of detailed requirements. But this approach let the Army take both companies’ products for a test drive before settling on the winner.

“The Army changed its approach to acquisition,” Doug Philippone, a former Army Ranger who leads Palantir’s defense business, said in an interview.

He said the company was always confident it could win if it were allowed to adjust its technology after getting feedback from soldiers, who he said put the software through a rigorous test, even parachuting out of airplanes with reinforced laptops containing Palantir’s software.

Chris Johnson, a spokesman for Raytheon, said the company was disappointed in the outcome. “We will wait for the Army’s debrief to understand their decision.”

The Army did not provide a comment for this story.

Raytheon and Palantir may compete for subsequent phases of work on the program.

Unlike most Silicon Valley start-ups, which aim to make their fortunes building consumer applications and software, Palantir at its founding set its sights on Washington, believing that its data analytics tools would find an eager market among U.S. spy agencies and the military, which are constantly trying to manage ever-expanding streams of information.

Philippone said the Army win had validated Palantir’s strategy.

“We founded the company around solving this particular mission,” he said.

The company faced initial skepticism from investors, who thought it couldn’t overcome entrenched bureaucratic interests and what they saw as political favoritism that led the Pentagon to spend billions every year with the same small group of Beltway contractors.

“Everyone told us we should stay away from Washington because it was corrupt and we didn’t know how to play golf with senators,” Joe Lonsdale, a Palantir co-founder, said in a 2011 interview.

The company got an early investment in 2005 from In-Q-Tel, the CIA’s venture capital arm, which tries to quickly develop technologies that the intelligence agency might use.

The In-Q-Tel connection helped Palantir get meetings with U.S. officials and intelligence analysts, and even test its software with the CIA’s counterterrorism center, according to people familiar with the matter.


On WhatsApp, fake news is fast — and can be fatal


Demonstrators in Guwahati, India, last month demand the arrest of those involved in the lynching deaths of two men sparked by rumors spread on Facebook and WhatsApp. (Biju Boro/AFP/Getty Images)

Americans associate misinformation with Facebook and the ways it shaped debate around the 2016 presidential election. But in other countries, falsities are just as likely to spread on private messaging services — sometimes with deadly consequences.

At least two dozen people have been killed in mob lynchings in India since the start of the year, their deaths fueled by rumors that spread on WhatsApp, the Facebook-owned messaging service. In Brazil, messages on WhatsApp falsely claimed a government-mandated yellow-fever vaccine was dangerous, leading people to avoid it. And as Mexico was heading into its presidential election this month, experts there called WhatsApp the ugly underbelly of the country’s news environment, a place where politically misleading stories, memes and messages can spread unchecked.

On WhatsApp, with 1.5 billion users, information can go viral in minutes as individuals forward messages along to their friends or groups, without any way to determine its origin.

Messaging platforms have hosted disinformation campaigns in at least 10 countries this year, according to a report by the Computational Propaganda Project at Oxford University. WhatsApp was the main platform for disinformation in seven of those nations, including Brazil, India, Pakistan, Zimbabwe and Mexico. Other messaging apps that have hosted disinformation include Telegram in Iran, WeChat in China and Line in Thailand.

“In the U.S., the disinformation debate is about the Facebook news feed, but globally, it’s all about closed messaging apps,” said Claire Wardle, executive director of First Draft, a nonprofit news literacy and fact-checking organization affiliated with Harvard University’s John F. Kennedy School of Government.

The closed nature of messaging services complicates the already difficult task of fighting rumors and stamping out lies. Unlike the largely open forums of Facebook and Twitter, WhatsApp hosts private chats among groups of friends. It is encrypted, or mathematically scrambled, so that no one — not even the service’s employees — can read the content of messages that were not intended for them.

“In many countries, messaging services are the main platform to get online,” said Samantha Bradshaw, co-author of the report from the Computational Propaganda Project. “The closed platforms can be more dangerous because the information is spreading in these intimate groups of friends and family — people we tend to trust.”

A group of friends picnicking in southern India this month stopped their car to offer some local children chocolate. It proved to be a deadly mistake — rumors quickly spread on WhatsApp that they were child kidnappers.

A violent mob gathered in response to the messages. In the end, one of the picnickers, a software engineer named Mohammed Azam Ahmed, 32, lay dead.

“They kept pleading, but nobody listened to them,” said the victim’s brother, Mohammed Akram. “My brother was killed by fake news.”

Now WhatsApp, under pressure from political leaders and spurred by new leadership, is taking steps to root out misinformation. Executives held urgent meetings with political leaders in India last week, and the service is building new technology to promote news literacy. On Thursday, the company announced a major change, limiting the ability to forward messages — a feature that has been blamed for enabling disinformation to go viral.

WhatsApp’s new boss — Chris Daniels, a veteran executive from WhatsApp’s parent company, Facebook — has vowed to prioritize safety. Daniels is playing catch-up with Facebook, which since the 2016 U.S. election has poured immense resources into combating viral fake news and other malicious content promoted by profiteers, ideologues and Russian operatives — an effort with mixed results. The fix at WhatsApp is even harder because the chat app was designed to be a black hole.

The app’s encryption makes it impossible for WhatsApp’s security staff to read messages unless a user specifically reports them as problematic. And because WhatsApp lets people sign up with just a phone number — unlike Facebook, WhatsApp does not require users to have an email address or reveal their real name — engineers have limited visibility into users’ friends or into what they’ve posted in the past, cutting them off from key clues to malicious behavior. WhatsApp says the average group size is six, but it allows groups of up to 256.

Conversations on these platforms are less visible to outsiders, journalists and fact-checkers who often debunk misinformation. In Colombia, Mexico and Brazil, news and fact-checking organizations have recently set up WhatsApp hotlines where people can forward along questionable content to be debunked. The organizations then return the correct story to the person who sent it and hope that person shares it with their groups.

For months leading up to the Mexican election, an edited video with accompanying text circulated on WhatsApp. The grainy footage showed a man being burned alive in the state of Tabasco, while a crowd shouted “Morena” — the political movement of the front-runner, Andrés Manuel López Obrador, who won the election — in the background, implying that the man was being tortured for his political beliefs. Accompanying text blamed López Obrador supporters, using a derogatory term for leftists. Many people forwarded it to the Mexican fact-checking hotline, asking whether it was real.

Journalists and fact-checkers who reviewed the video said that the man was attacked because he was stealing a motorcycle, not for his political beliefs, and that the full unedited version showed the crowd shouting the names of different political candidates. The fact-checking group published an article with the facts as they understood them — citing local news reports — and sent it back to all the people who forwarded the original falsehood, asking them to spread it in their groups. They said they recognized the limitations of the strategy.

In Brazil, a nationwide strike by truck drivers in May was organized on WhatsApp, said Daniel Bramatti, head of Abraji, Brazil’s association of investigative journalism, and an organizer of a WhatsApp hotline to vet news that may be fake. False stories have spread about political candidates and about the dangers of many vaccines, he said — the yellow-fever story reached so many people that the federal government issued an official warning debunking it. Since the hotline was created, 17,000 messages have been forwarded to the group, Bramatti said.

WhatsApp is giving software tools to Brazilian news organizations that will allow them to send a link to a fact-checked story to large numbers of users at once — allowing them to debunk fake news en masse — ahead of the country’s presidential election in October.

Brazil, which has about 120 million WhatsApp users, has shut down WhatsApp three times, most recently in 2016, over a fight between the site and the government, which wanted data about malicious actors and criminals.

But the dark side of misinformation on messaging services is most apparent in India, where more than 225 million people use WhatsApp, according to the Indian government, a total quickly gaining on the estimated 240 million who use Facebook.

There, the combination of an inexperienced and digitally illiterate user base, coupled with WhatsApp’s encryption, has proved to be toxic, leading to fear, misunderstanding and, in some cases, violence.

Beyond the July 13 incident that claimed the life of Ahmed, two dozen others have been killed in recent weeks by lynch mobs sparked by WhatsApp rumors of child-kidnapping rings or organ-harvesting gangs, authorities say.

The violence has prompted an angry warning from the Indian government. Last week, the government called on WhatsApp to do more to address accountability and “traceability” in the app to stem the tide of fake news — or face legal action.

Many police departments and municipalities have created their own grass-roots response to the crisis, including public education campaigns with street theater, or “town criers” going from village to village with loudspeakers on top of vans, warning citizens not to believe fake news.

But some Internet experts say that the Indian government has been slow to respond to the growing problem, in part because the country’s political parties are hardcore users of the platform and often use it to send out false or misleading information themselves.

In a recent regional election in India that was seen as a prelude to the country’s 2019 national election, WhatsApp researchers found that one political party — which they did not name — used the platform inappropriately, with party loyalists setting up thousands of WhatsApp groups and in some cases successfully spamming users with near-constant political messages.

The company was able to catch and block some of the accounts, but many slipped through its fingers.

The misuse of WhatsApp mirrors the way in which other tech tools have been weaponized in recent years, in particular around misinformation. From its start, the company was built to collect as little information about users as possible, and WhatsApp’s founders, Brian Acton and Jan Koum, were libertarians who believed deeply in privacy.

After Facebook acquired WhatsApp for $19 billion in 2014, its largest acquisition ever, the messaging company operated separately from its parent and was divorced from Facebook’s efforts to combat misinformation, such as hiring thousands of moderators and building artificial-intelligence software to spot malicious posts. Researchers said that independence allowed problems to fester, undermining Facebook’s corporate mission to promote democracy around the world.

“Our focus has always been on helping people stay safe and maintaining private communication on WhatsApp,” said WhatsApp spokesman Carl Woog. “We recognize the severe consequences that can come from viral misinformation, and we’re working with others to address this challenge.”

Acton and Koum fought frequently with Facebook over user privacy, access to data and how to make WhatsApp turn a profit, according to two people familiar with the debates. Acton left late last year. Koum announced his resignation this year after a Washington Post report revealed he planned to leave over broad clashes with Facebook.

“They were fierce when it came to data privacy, and they were fiercely independent,” said Kevin Lee, who was a global manager for spam operations at Facebook through 2016.

WhatsApp does not see itself as a social-media service, because content is not posted publicly and algorithms do not spread information virally. But even without algorithms, WhatsApp’s ability to forward messages has turned it into a hybrid. Its leadership was previously opposed to any efforts to intervene in users’ ability to send messages.

The shutdown in Brazil and growing commercial spam problems led to a crisis within WhatsApp, and Facebook began to clamp down, according to the two people familiar with the matter. Facebook sent engineers and pushed WhatsApp to hire policy experts for the first time, doubling the company’s size and moving WhatsApp’s headquarters to Menlo Park, Calif., where Facebook is based.

Last week the company started training Indian nonprofits on how to spot fake news and “to think before you share,” Woog said. WhatsApp also ran full-page newspaper ads in India that included 10 tips on how to recognize false information, including “check information that seems unbelievable” and “use other sources.”

The company is hiring engineers to specifically focus on disinformation in elections, and it is building in new technology that will indicate whether a message has been forwarded, an indicator that the person who sent the message did not actually write the story or produce the content in question.

In his first week on the job in May, Daniels, who declined to be interviewed, assembled WhatsApp’s 300 employees for a town hall. The commitment to privacy would not change, he told them, but from now on the focus of the service would also include safety — preventing misinformation and the harm it can cause, according to an executive who attended the meeting.

Gowen reported from New Delhi. Farheen Fatima in New Delhi contributed to this report.


Mark Zuckerberg: The Internet needs new rules. Let’s start in these four areas.


Facebook, Messenger and Instagram apps on an iPhone screen. (Jenny Kane/AP)

Mark Zuckerberg is founder and chief executive of Facebook.

Technology is a major part of our lives, and companies such as Facebook have immense responsibilities. Every day, we make decisions about what speech is harmful, what constitutes political advertising, and how to prevent sophisticated cyberattacks. These are important for keeping our community safe. But if we were starting from scratch, we wouldn’t ask companies to make these judgments alone.

I believe we need a more active role for governments and regulators. By updating the rules for the Internet, we can preserve what’s best about it — the freedom for people to express themselves and for entrepreneurs to build new things — while also protecting society from broader harms.

From what I’ve learned, I believe we need new regulation in four areas: harmful content, election integrity, privacy and data portability.

First, harmful content. Facebook gives everyone a way to use their voice, and that creates real benefits — from sharing experiences to growing movements. As part of this, we have a responsibility to keep people safe on our services. That means deciding what counts as terrorist propaganda, hate speech and more. We continually review our policies with experts, but at our scale we’ll always make mistakes and decisions that people disagree with.

Lawmakers often tell me we have too much power over speech, and frankly I agree. I’ve come to believe that we shouldn’t make so many important decisions about speech on our own. So we’re creating an independent body so people can appeal our decisions. We’re also working with governments, including French officials, on ensuring the effectiveness of content review systems.

Internet companies should be accountable for enforcing standards on harmful content. It’s impossible to remove all harmful content from the Internet, but when people use dozens of different sharing services — all with their own policies and processes — we need a more standardized approach.

One idea is for third-party bodies to set standards governing the distribution of harmful content and to measure companies against those standards. Regulation could set baselines for what’s prohibited and require companies to build systems for keeping harmful content to a bare minimum.

Facebook already publishes transparency reports on how effectively we’re removing harmful content. I believe every major Internet service should do this quarterly, because it’s just as important as financial reporting. Once we understand the prevalence of harmful content, we can see which companies are improving and where we should set the baselines.

Second, legislation is important for protecting elections. Facebook has already made significant changes around political ads: Advertisers in many countries must verify their identities before purchasing political ads. We built a searchable archive that shows who pays for ads, what other ads they ran and what audiences saw the ads. However, deciding whether an ad is political isn’t always straightforward. Our systems would be more effective if regulation created common standards for verifying political actors.

Online political advertising laws primarily focus on candidates and elections, rather than divisive political issues where we’ve seen more attempted interference. Some laws only apply during elections, although information campaigns are nonstop. And there are also important questions about how political campaigns use data and targeting. We believe legislation should be updated to reflect the reality of the threats and set standards for the whole industry.

Third, effective privacy and data protection needs a globally harmonized framework. People around the world have called for comprehensive privacy regulation in line with the European Union’s General Data Protection Regulation, and I agree. I believe it would be good for the Internet if more countries adopted regulation such as GDPR as a common framework.

New privacy regulation in the United States and around the world should build on the protections GDPR provides. It should protect your right to choose how your information is used — while enabling companies to use information for safety purposes and to provide services. It shouldn’t require data to be stored locally, which would make it more vulnerable to unwarranted access. And it should establish a way to hold companies such as Facebook accountable by imposing sanctions when we make mistakes.

I also believe a common global framework — rather than regulation that varies significantly by country and state — will ensure that the Internet does not get fractured, entrepreneurs can build products that serve everyone, and everyone gets the same protections.

As lawmakers adopt new privacy regulations, I hope they can help answer some of the questions GDPR leaves open. We need clear rules on when information can be used to serve the public interest and how it should apply to new technologies such as artificial intelligence.

Finally, regulation should guarantee the principle of data portability. If you share data with one service, you should be able to move it to another. This gives people choice and enables developers to innovate and compete.

This is important for the Internet — and for creating services people want. It’s why we built our development platform. True data portability should look more like the way people use our platform to sign into an app than the existing ways you can download an archive of your information. But this requires clear rules about who’s responsible for protecting information when it moves between services.

This also needs common standards, which is why we support a standard data transfer format and the open source Data Transfer Project.

I believe Facebook has a responsibility to help address these issues, and I’m looking forward to discussing them with lawmakers around the world. We’ve built advanced systems for finding harmful content, stopping election interference and making ads more transparent. But people shouldn’t have to rely on individual companies addressing these issues by themselves. We should have a broader debate about what we want as a society and how regulation can help. These four areas are important, but, of course, there’s more to discuss.

The rules governing the Internet allowed a generation of entrepreneurs to build services that changed the world and created a lot of value in people’s lives. It’s time to update these rules to define clear responsibilities for people, companies and governments going forward.


Alex Horton

Washington, D.C.

General assignment reporter covering national and breaking news. Education: Georgetown University, BA in English.

Alex Horton is a general assignment reporter for The Washington Post. He previously covered the military and national security for Stars and Stripes, and served in Iraq as an Army infantryman.

NSA collects millions of e-mail address books globally

Correction: An earlier version of this story incorrectly said that the National Security Agency's Australian counterpart assisted the United States in the collection of contact lists from personal e-mail and instant messaging accounts. The assistance was provided by NSA's counterpart in Britain, the Government Communications Headquarters.

In June, President Obama said the NSA’s email collecting program “does not apply to U.S. citizens.” (Thomas LeGro/The Washington Post)

The National Security Agency is harvesting hundreds of millions of contact lists from personal e-mail and instant messaging accounts around the world, many of them belonging to Americans, according to senior intelligence officials and top-secret documents provided by former NSA contractor Edward Snowden.

The collection program, which has not been disclosed before, intercepts e-mail address books and “buddy lists” from instant messaging services as they move across global data links. Online services often transmit those contacts when a user logs on, composes a message, or synchronizes a computer or mobile device with information stored on remote servers.

Rather than targeting individual users, the NSA is gathering contact lists in large numbers that amount to a sizable fraction of the world’s e-mail and instant messaging accounts. Analysis of that data enables the agency to search for hidden connections and to map relationships within a much smaller universe of foreign intelligence targets.

During a single day last year, the NSA’s Special Source Operations branch collected 444,743 e-mail address books from Yahoo, 105,068 from Hotmail, 82,857 from Facebook, 33,697 from Gmail and 22,881 from unspecified other providers, according to an internal NSA PowerPoint presentation. Those figures, described as a typical daily intake in the document, correspond to a rate of more than 250 million a year.
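The per-provider figures in the presentation are consistent with the article’s annual estimate; a quick arithmetic check, assuming the “typical daily intake” holds across a 365-day year, bears out the “more than 250 million a year” rate:

```python
# Per-provider daily address-book collection totals, as reported
# in the internal NSA PowerPoint presentation cited above.
daily_counts = {
    "Yahoo": 444_743,
    "Hotmail": 105_068,
    "Facebook": 82_857,
    "Gmail": 33_697,
    "other providers": 22_881,
}

daily_total = sum(daily_counts.values())
annual_rate = daily_total * 365  # assumes the daily intake is typical year-round

print(f"{daily_total:,} address books per day")   # 689,246 address books per day
print(f"{annual_rate:,} per year")                # 251,574,790 per year
```

The daily total of 689,246 multiplied over a year comes to roughly 251.6 million, matching the article’s “more than 250 million a year.”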

Each day, the presentation said, the NSA collects contacts from an estimated 500,000 buddy lists on live-chat services as well as from the inbox displays of Web-based e-mail accounts.

Read the documents

The NSA's problem? Too much data.

Read select pages from an NSA briefing on problems with high-volume, low-value collection of e-mail address books and buddy lists.

SCISSORS: How the NSA collects less

An NSA presentation on the SCISSORS tool that helps the agency cut out data it does not need.

An excerpt from the NSA's Wikipedia

An article from "Intellipedia," the NSA's classified wiki, on the problem of overcollection of data from Internet contact lists.

The collection depends on secret arrangements with foreign telecommunications companies or allied intelligence services in control of facilities that direct traffic along the Internet’s main data routes.

Although the collection takes place overseas, two senior U.S. intelligence officials acknowledged that it sweeps in the contacts of many Americans. They declined to offer an estimate but did not dispute that the number is likely to be in the millions or tens of millions.

A spokesman for the Office of the Director of National Intelligence, which oversees the NSA, said the agency “is focused on discovering and developing intelligence about valid foreign intelligence targets like terrorists, human traffickers and drug smugglers. We are not interested in personal information about ordinary Americans.”

The spokesman, Shawn Turner, added that rules approved by the attorney general require the NSA to “minimize the acquisition, use and dissemination” of information that identifies a U.S. citizen or permanent resident.

The NSA’s collection of nearly all U.S. call records, under a separate program, has generated significant controversy since it was revealed in June. The NSA’s director, Gen. Keith B. Alexander, has defended “bulk” collection as an essential counterterrorism and foreign intelligence tool, saying, “You need the haystack to find the needle.”

Contact lists stored online provide the NSA with far richer sources of data than call records alone. Address books commonly include not only names and e-mail addresses, but also telephone numbers, street addresses, and business and family information. Inbox listings of e-mail accounts stored in the “cloud” sometimes contain content, such as the first few lines of a message.

Taken together, the data would enable the NSA, if permitted, to draw detailed maps of a person’s life, as told by personal, professional, political and religious connections. The picture can also be misleading, creating false “associations” with ex-spouses or people with whom an account holder has had no contact in many years.

The NSA has not been authorized by Congress or the special intelligence court that oversees foreign surveillance to collect contact lists in bulk, and senior intelligence officials said it would be illegal to do so from facilities in the United States. The agency avoids the restrictions in the Foreign Intelligence Surveillance Act by intercepting contact lists from access points “all over the world,” one official said, speaking on the condition of anonymity to discuss the classified program. “None of those are on U.S. territory.”

Because of the method employed, the agency is not legally required or technically able to restrict its intake to contact lists belonging to specified foreign intelligence targets, he said.

When information passes through “the overseas collection apparatus,” the official added, “the assumption is you’re not a U.S. person.”

In practice, data from Americans is collected in large volumes — in part because they live and work overseas, but also because data crosses international boundaries even when its American owners stay at home. Large technology companies, including Google and Facebook, maintain data centers around the world to balance loads on their servers and work around outages.

A senior U.S. intelligence official said the privacy of Americans is protected, despite mass collection, because “we have checks and balances built into our tools.”

NSA analysts, he said, may not search within the contacts database or distribute information from it unless they can “make the case that something in there is a valid foreign intelligence target in and of itself.”

In this program, the NSA is obliged to make that case only to itself or others in the executive branch. With few exceptions, intelligence operations overseas fall solely within the president’s legal purview. The Foreign Intelligence Surveillance Act, enacted in 1978, imposes restrictions only on electronic surveillance that targets Americans or takes place on U.S. territory.

By contrast, the NSA draws on authority in the Patriot Act for its bulk collection of domestic phone records, and it gathers online records from U.S. Internet companies, in a program known as PRISM, under powers granted by Congress in the FISA Amendments Act. Those operations are overseen by the Foreign Intelligence Surveillance Court.

Sen. Dianne Feinstein, the California Democrat who chairs the Senate Intelligence Committee, said in August that the committee has less information about, and conducts less oversight of, intelligence gathering that relies solely on presidential authority. She said she planned to ask for more briefings on those programs.

“In general, the committee is far less aware of operations conducted under 12333,” said a senior committee staff member, referring to Executive Order 12333, which defines the basic powers and responsibilities of the intelligence agencies. “I believe the NSA would answer questions if we asked them, and if we knew to ask them, but it would not routinely report these things, and, in general, they would not fall within the focus of the committee.”

Because the agency captures contact lists “on the fly” as they cross major Internet switches, rather than “at rest” on computer servers, the NSA has no need to notify the U.S. companies that host the information or to ask for help from them.

“We have neither knowledge of nor participation in this mass collection of web-mail addresses or chat lists by the government,” said Google spokeswoman Niki Fenwick.

At Microsoft, spokeswoman Nicole Miller said the company “does not provide any government with direct or unfettered access to our customers’ data,” adding that “we would have significant concerns if these allegations about government actions are true.”

Facebook spokeswoman Jodi Seth said that “we did not know and did not assist” in the NSA’s interception of contact lists.

It is unclear why the NSA collects more than twice as many address books from Yahoo as from the other big services combined. One possibility is that Yahoo, unlike other service providers, has left connections to its users unencrypted by default.

Suzanne Philion, a Yahoo spokeswoman, said Monday in response to an inquiry from The Washington Post that, beginning in January, Yahoo would begin encrypting all its e-mail connections.

Google was the first to secure all its e-mail connections, turning on “SSL encryption” globally in 2010. People with inside knowledge said the move was intended in part to thwart large-scale collection of its users’ information by the NSA and other intelligence agencies.

The volume of NSA contacts collection is so high that it has occasionally threatened to overwhelm storage repositories, forcing the agency to halt its intake with “emergency detasking” orders. Three NSA documents describe short-term efforts to build an “across-the-board technology throttle for truly heinous data” and longer-term efforts to filter out information that the NSA does not need.

Spam has proven to be a significant problem for the NSA — clogging databases with information that holds no foreign intelligence value. The majority of all e-mails, one NSA document says, “are SPAM from ‘fake’ addresses and never ‘delivered’ to targets.”

In fall 2011, according to an NSA presentation, the Yahoo account of an Iranian target was “hacked by an unknown actor,” who used it to send spam. The Iranian had “a number of Yahoo groups in his/her contact list, some with many hundreds or thousands of members.”

The cascading effects of repeated spam messages, compounded by the automatic addition of the Iranian’s contacts to other people’s address books, led to a massive spike in the volume of traffic collected by the British intelligence service on the NSA’s behalf.

After nine days of data-bombing, the Iranian’s contact book and contact books for several people within it were “emergency detasked.”

In a briefing from the NSA’s Large Access Exploitation working group, that example was used to illustrate the need to narrow the criteria for data interception. It called for a “shifting collection philosophy”: “Memorialize what you need” vs. “Order one of everything off the menu and eat what you want.”

Julie Tate contributed to this report. Soltani is an independent security researcher and consultant.

Supreme Court agrees to hear ‘Carpenter v. United States,’ the Fourth Amendment historical cell-site case - The Washington Post
The Volokh Conspiracy

Supreme Court agrees to hear ‘Carpenter v. United States,’ the Fourth Amendment historical cell-site case

Contributor, The Volokh Conspiracy

There was enormously important Fourth Amendment news from the Supreme Court on Monday: The justices agreed to review the U.S. Court of Appeals for the 6th Circuit’s decision in Carpenter v. United States, one of the long-pending cases on whether the Fourth Amendment protects government access to historical cell-site records.

This is a momentous development, I think. It’s not an exaggeration to say that the future of surveillance law hinges on how the Supreme Court rules in the case. Let me say a bit about the case, the issues it will decide and why it matters.

I. The Facts of the Case

Carpenter involves a string of armed robberies that occurred over a two-year period. A group of men (at least five of them) would go into cellphone stores armed with guns, order the customers and employees to the back, and steal the phones. Carpenter was the lead organizer of the conspiracy, and he often supplied the guns, acted as a lookout and would signal when each robbery was to begin.

One of Carpenter’s conspirators confessed to the crime and gave the government his cellphone number and the numbers of the other conspirators (16 numbers total). The government applied for three different court orders for the cell-site records associated with those numbers, which included Carpenter’s number. Specifically, the orders sought “cell site information” for Carpenter’s phone “at call origination and at call termination for incoming and outgoing calls.” The government obtained the orders under the Stored Communications Act. The orders complied with the statute, but the statute requires only reasonable suspicion, not probable cause.

The order that covered Carpenter was directed at his cellphone provider, MetroPCS. MetroPCS produced 127 days of historical cell-site records. (Sprint produced another seven days of historical cell-site records for Carpenter’s phone from a time window when he was “roaming” and Sprint picked up his service instead of MetroPCS.) Together, the records obtained under the orders showed that the phones of the alleged conspirators were within distances ranging from a half-mile to two miles of the robberies at the time they occurred. Specifically, Carpenter’s phone was shown to be in communication with cell towers near four robberies over a five-month window.

II. The Legal Issues

Here is how counsel for the petitioner framed the “question presented”:

Whether the warrantless seizure and search of historical cell phone records revealing the location and movements of a cell phone user over the course of 127 days is permitted by the Fourth Amendment.

And here’s how the United States redrafted the question presented in its brief in opposition:

Whether the government’s acquisition, pursuant to a court order issued under 18 U.S.C. 2703(d), of historical cell-site records created and maintained by a cellular-service provider violates the Fourth Amendment rights of the individual customer to whom the records pertain.

I gather, then, that the case will consider two distinct questions. First, is the collection of the records a Fourth Amendment search? And second, if it is a search, is it a search that requires a warrant?

Notably, neither side sought review of whether the good-faith exception applies if the answer to both of these questions is “yes.” The parties are asking only for a ruling on the merits, with any remedies decision bifurcated for review on remand if the Supreme Court reverses.

III. Why The Case Matters

The Carpenter case is tremendously important, I think. The structure of modern surveillance law is built on the idea that the contents of communications receive Fourth Amendment protection but that non-content metadata — records about communications, and other third-party business records — do not. That has been the rule since the 19th century for postal letters, and it has been the rule since 1979 for phone calls. Carpenter will help determine if that basic rule framework will remain, or if the Supreme Court will amend it somewhat or even dramatically change it.

Part of the importance of the case is that it’s not just about cell-site records. Although the case is formally about cell-site records, it’s really about where to draw lines in terms of what network surveillance triggers the Fourth Amendment and how the Fourth Amendment applies. The justices can’t answer how the Fourth Amendment applies to cell-site records without providing a framework for how the Fourth Amendment applies to many other forms of surveillance, such as visual surveillance, obtaining traditional phone records, obtaining e-mail transactional records, obtaining credit card records and the like.

For example, readers will recall the debate over the mosaic theory of the Fourth Amendment. Among the issues likely to be pressed in Carpenter is whether the justices should adopt or reject the mosaic theory. Note that the question presented focuses on the fact that the records covered 127 days. Should the length of time covered by the records matter? Can evidence collection that is not a search over a short time window become a search because the records span a long time window?

Plus, remember that the justices will have two questions: what a search is, and when searches are reasonable. Most will focus on the first question, but note that the two issues go together. As I explained here, the broader the court interprets “search,” the more pressure there is to water down reasonableness. The narrower the definition of “search,” the stronger the reasonableness standard tends to be. This creates some interesting dynamics. For example, you might get a ruling that there is no search but that retains the traditional default warrant rule for searches. On the other hand, you might get a ruling that a search occurred but that authorizes a new category of warrantless surveillance. This is just speculation, of course, but I suspect the briefing will urge major doctrinal innovations on both questions.

IV. Why Did the Justices Take the Case?

Some will speculate that the Supreme Court would have taken the case only if it were going to reverse. I have no idea how the court will rule, but I tend to doubt that. If I had to guess, I would guess that the court took these cases because they’re really important. The lower court rulings are based on the third-party doctrine, and none of the current justices were on the court the last time the justices decided a case on the third-party doctrine. It’s pretty sensible to have the current Supreme Court weigh in.

As it happens, I think the third-party doctrine is essential to technological surveillance in a digital age. As I see it, the doctrine is needed to maintain the essential balance on which Fourth Amendment law has been built and on which it evolves in response to new technology. Prominent alternatives, like the mosaic theory, strike me as a dead end. But it makes a lot of sense for the justices to review these cases and decide whether they agree — and if not, identify what new framework should replace it.

V. Lots of Blogging Ahead

Finally, I’ll probably be doing a lot of carpentry (that is, blogging about the issues raised in Carpenter) over the next few months. A lot of my academic work in the past decade has been about issues that touch on the case, so it will be really fun to see what the justices do.

Coast Guard lieutenant used work computers in alleged planning of widespread domestic terrorist attack, prosecutors say - The Washington Post

Coast Guard lieutenant used work computers in alleged planning of widespread domestic terrorist attack, prosecutors say

The U.S. Coast Guard lieutenant spent hours on end planning a wide-scale domestic terrorist attack, even logging in at his work computer on the job at headquarters to study the manifestos and heinous paths of mass shooters, prosecutors say. He researched how to carry out sniper attacks, they contend, and whether rifle scopes were illegal. And all the while, investigators assert, he was amassing a cache of weapons as he ruminated about attacks on politicians and journalists.

But Christopher P. Hasson was not an isolated figure, according to a contractor who worked with him. The 49-year-old lieutenant with more than two decades in the Coast Guard was part of a project to replace some aging cutters in the fleet, tasks that regularly required interacting with civilians and military officials at meetings and on travel.

“I don’t remember him saying anything that was crazy,” said Adam Stolzberg, a contractor who worked at headquarters and was in meetings with Hasson a couple of times a month. Politics never came up, Stolzberg said.

It was only after Hasson’s arrest last Friday at his workplace that the chilling plans prosecutors assert he was crafting became apparent, detected by an internal Coast Guard program that watches for any “insider threat.”

The program identified suspicious computer activity tied to Hasson, prompting the agency’s investigative service to launch an investigation last fall, said Lt. Cmdr. Scott McBride, a service spokesman.

Hasson was arrested on gun and drug charges after officials with the Coast Guard Investigative Service and agents with the FBI in Baltimore began probing activities that prosecutors said in court were linked to what they described as Hasson’s white-nationalist views. Federal law enforcement officials seized a stockpile of guns and ammunition from his basement apartment on the far east side of Silver Spring in Montgomery County, in the Maryland suburbs near Washington.


Federal investigators allege Christopher P. Hasson had a cache of guns stockpiled to launch a terrorist attack targeting liberal politicians and journalists. (U.S. Attorney’s Office in Maryland)

“The sheer number and force of the weapons recovered from Mr. Hasson’s residence in this case, coupled with the disturbing nature of his writings, appear to reflect a very significant threat to the safety of our community, particularly given the position of trust that Mr. Hasson held with the United States government,” U.S. Attorney for the District of Maryland Robert K. Hur said Thursday after a hearing in which Hasson was ordered detained.

Prosecutors and Hasson’s federal public defender sparred over whether it was appropriate to jail him after an arrest on gun and drug charges but no terrorism-related counts.

The judge, Charles B. Day, said that it is unusual to detain a defendant based on the charges Hasson was facing and that the issue at hand is “all about the defendant’s state of mind and intentions.”

Hasson’s federal public defender, Julie Stelzig, said the government’s court filings are a “histrionic characterization of Mr. Hasson” and that there was “no actual indication of any plan.” She said that Hasson had no prior record and that the number of weapons he had was “modest at best” for average gun collectors.

“It’s not a crime to think negative thoughts,” Stelzig said of the writings the government points to as evidence of his extremist views. “It’s not a crime to think about doomsday scenarios.”

But with Hasson in court, prosecutors called him a “domestic terrorist” who intended to “murder innocent civilians.”

“What drives the government’s concern is what also gives the court pause,” Day said before he gave the government 14 days to bring additional charges and before Hasson’s lawyer could file an appeal for his possible release.

Hasson called for “focused violence” to “establish a white homeland,” prosecutors said in court filings. It’s unclear whether Hasson had a specific date for an attack, but the government said he had been stockpiling weapons for at least two years, spending $14,000 a year on equipment to ready for an attack.

As he built an arsenal, prosecutors contend, Hasson read manifestos by the Unabomber, the Virginia Tech shooter and the Olympic Park bomber, among other domestic attackers, and also looked for guidance in the plot of right-wing terrorist Anders Behring Breivik, who in 2011 unleashed two attacks in Norway that killed 77 people.

“I am dreaming of a way to kill almost every last person on the earth,” Hasson said in one of his letters that contemplated creating a biological plague, according to records filed in U.S. District Court in Maryland.

During the raid this month, law enforcement officers seized 15 firearms and more than 1,000 rounds of ammunition from what they called his “cramped basement apartment,” near the Prince George’s County line.

Hasson, who had a bald head and was wearing a pink prison uniform, did not speak in court.

No one answered the door at the residential address that appeared to be associated with Hasson. Law enforcement officials in Montgomery County also said that while there were calls about loud parties in the area in recent years, there were no calls for service that would indicate anything was amiss at that residence.

Hasson joined the Coast Guard in March 1996 as an enlisted electronics technician and was promoted to chief warrant officer in 2012 and lieutenant in 2015, McBride said. He will remain on active duty until the legal case against him is adjudicated but has stopped working since his arrest.

Hasson was arrested once the FBI and Coast Guard investigators were “confident in the strength of the evidence supporting the criminal complaint and warrant,” McBride said.

As recently as Jan. 17, Hasson created a list of “traitors” and targets in a spreadsheet while reviewing various broadcast news sites from his work computer, court filings show. The list included people prosecutors believe to be Sen. Kamala D. Harris (D-Calif.), CNN reporter Don Lemon and nearly two dozen others.

“Unlawful possession of drugs and firearms, as well as advocacy for supremacist doctrine, ideology, or causes, violates Coast Guard policy, the Uniform Code of Military Justice, and our organization core values,” McBride said in an email.

Hasson’s access to Coast Guard headquarters has been revoked. He held a secret-level security clearance beginning in April 2005, and background checks did not find information that merited denying it, McBride said. Secret clearance typically allows access to information that can cause serious damage to U.S. national security if disclosed. It is considered more significant than confidential access and less significant than top-secret access.

Yvonne Carlock, a Marine Corps spokeswoman, said Wednesday that Hasson joined the service in December 1988, serving as an F/A-18 aircraft mechanic. His last rank in that service was corporal.

Federal authorities said he left sometime in 1993.

In June 1994, Hasson moved over to the Virginia Army National Guard, becoming an infantryman with Alpha Company, 1st Battalion, 183rd Infantry Regiment, said Kurt Rauschenberg, a National Guard Bureau spokesman. His unit was based south of Richmond near the town of Petersburg.

In September 1995, Hasson switched to the Arizona Army National Guard and left about six months later, in March 1996, exiting with the same rank as when he joined.

Property records indicate Hasson moved frequently in his varied military career, including stints in Arizona, California and Virginia. In 2007, he bought a house in Currituck, N.C., just across the bay from the Outer Banks. Neighbors said Hasson lived in the house for several years with a woman they identified as his wife and at least one young child.

“It was very neat,” said Delena Ostrander, who owns the adjacent lot. “I never heard any complaints.”

Her stepfather, Stanley Maculewich, still lives on the short, unpaved lane and remembers Hasson as a big, gun-owning Coast Guardsman who commuted to work early each morning by motorcycle.

“He was a good-sized guy, but I had no problem with him,” Maculewich said. “He was shooting his gun out there one day. But when I asked him to stop because I had a daughter in the house who was sick, he said ‘Fine.’ And that was it, he stopped.”

Stolzberg, the contractor who also worked at Coast Guard headquarters with Hasson, said Hasson never raised any alarms at the office. Tall and muscular with a shaved or bald head, Hasson sometimes drove a Harley-Davidson motorcycle to work, Stolzberg said. But his black leather jacket didn’t bear any insignia, and his arm tattoos didn’t appear out of the ordinary. Nor did Hasson express any radical views, Stolzberg said.

“I didn’t have a hard time getting along with him,” he said. “I was trying to think back: What did I miss? Was there anything there?”

Dan Morse, Steve Hendrix, Jennifer Barrios, Julie Tate and Alice Crites contributed to this report.

Hackers used a fish tank to break into a Vegas casino. We’re all in trouble. - The Washington Post
Monkey Cage

Hackers used a fish tank to break into a Vegas casino. We’re all in trouble.


A child admires a flowerhorn cichlid fish. In Las Vegas, one casino’s Internet-connected fish tank proved to be a doorway for hackers. (John T. Greilick/Detroit News/AP)

Bruce Schneier’s new book, “Click Here to Kill Everybody,” explains the security risks of a new world of household devices connected to the Internet. I asked him what the risks are, why they are so serious and what their consequences are for politics.

HF: Technology has created a hyper-connected world. How does this lead to vulnerabilities?

BS:  As we connect more things to the Internet, they can affect each other. This is generally a goodness, but it leads to vulnerabilities in unexpected ways. First, vulnerabilities in one thing can affect another thing. We saw this last year when a major Vegas casino’s high-roller database was hacked through — and I am not making this up — its Internet-connected fish tank.

The second way hyper-connection leads to vulnerabilities is that individual things, when combined, can generate new vulnerabilities. That is, it is their interaction that creates the vulnerabilities, without any individual system being at fault.

The third way is that vulnerabilities can cascade catastrophically. We also saw this in 2016 when vulnerabilities in Internet-connected webcams and digital video recorders enabled attackers to build a massive cyberweapon that, through a series of steps, took dozens of popular websites offline.

HF: How are those vulnerabilities changing as more and more of our everyday devices become connected to the Internet?

BS: What’s new with everyday devices like appliances, cars, medical devices, thermostats, consumer goods, toys and so on is that they do things. They affect the world in a direct physical manner. We used to only be concerned about bits and bytes. Now the risks are against life and property.

This fundamentally changes our threat model and obsoletes a lot of the security assumptions we have been making for decades: assumptions about how authentication works, about software reliability and patching, and about the wisdom of an unregulated technology space.

HF: You argue that “everyone wants you to have security, except from them.” Why is this so?

BS: We’ve built an Internet where the predominant business model is surveillance capitalism. So companies like Google and Facebook want your data to be secure from hackers and governments, as long as they get to spy on everything you’re doing — because that’s how they make money. Similarly, governments are all for security, as long as they get to access your data when they want it. As long as there’s this alliance between the big Internet companies and governments to ensure we can all be spied on, we won’t get real security.

HF: Why do businesses not have the appropriate incentives to fix the problems they are creating?

BS: There’s the spying I just mentioned, but it also goes deeper than that. Security isn’t something the public can evaluate. Consumers can’t tell which router or refrigerator is secure, even if they were willing to pay more for that security, so it’s not something businesses can use to differentiate themselves. Even worse, the risks are long-term and theoretical, which makes businesses willing to skimp on security and hope for the best.

We’ve seen this before. It’s a rare exception for an industry in the past century to improve its security and safety without being forced to by government: automobiles, airplanes, pharmaceuticals, food and restaurants, consumer goods.

HF: What should government do, and how does it need to change in order to do it?

BS: This is a complicated question and one that I spend most of my book trying to answer. I recommend a cocktail of different government interventions. I propose both explicit security rules and more flexible security standards. I propose liabilities when companies are negligent. I propose new laws, new legal interpretations of existing laws, and new actions by federal agencies. I see a role for international regulatory bodies and treaties, because many of the risks are fundamentally global.

The hard part is recognizing that the risks are great enough to require immediate action. My fear is that it will take a catastrophe — crashing *all* the cars, or shutting down *all* the power plants — to galvanize governments into action, and that they’ll react with something hastily put together and ill-considered. Our choice is not between government regulation and no government regulation; it’s between smart government regulation and stupid government regulation.

This article is one in a series supported by the MacArthur Foundation Research Network on Opening Governance that seeks to work collaboratively to increase our understanding of how to design more effective and legitimate democratic institutions using new technologies and new methods. Neither the MacArthur Foundation nor the network is responsible for the article’s specific content. Other posts can be found here.

Trump administration’s new Arctic defense strategy expected to zero in on concerns about China - The Washington Post

Trump administration’s new Arctic defense strategy expected to zero in on concerns about China

Beijing’s recent behavior in the Arctic has triggered some alarms in the Pentagon.


A submarine breaks through ice in the Beaufort Sea off Alaska's north coast. (U.S. Navy/AP)

The Trump administration is drafting a new Arctic defense strategy focusing heavily on competition with China, whose expansion around the world has drawn increasing scrutiny from senior U.S. officials.

The document will outline how the Pentagon “can best defend U.S. national interests and support security and stability in the Arctic,” said Johnny Michael, a Pentagon spokesman. It will do so, he said, within the framework of the Pentagon’s national defense strategy, which last year emphasized shifting the military away from counterterrorism operations to “great-power competition” with Russia and China.

Both the Defense Department and the White House’s National Security Council will be involved in writing the document, which was mandated by Congress and must be delivered to lawmakers by June.

The discussions come as U.S. defense officials increasingly turn their eyes north, noting how the receding polar sea ice is opening new paths for sea vessels. In recent months, the Pentagon sailed an aircraft carrier above the Arctic Circle for the first time in decades, added more fighter jets to Alaska and made plans to base Navy P-8 submarine-hunting reconnaissance planes in Iceland.

“We welcome any country to operate in the Arctic as long as that presence is in compliance with international norms and rules of behavior,” Michael said. “The United States and its Arctic ally and partner nations work together in numerous forums to address shared regional concerns including fisheries management, shipping safety and scientific research.”

It isn’t clear if the strategy will address why sea ice is melting, or simply focus on what U.S. officials should do to protect the region. While most climate scientists and U.S. intelligence agencies have identified climate change as a threat to national security, President Trump has questioned its existence. Defense officials have sought to avoid the controversy, accepting that ice is melting but leaving it to others to explain why.

“It’s kind of like, ‘We don’t want to talk about the science, but we know the conditions have changed,’” said Sherri Goodman, who studies Arctic issues and focused on environmental issues in the Clinton administration. “I definitely could see them taking that approach in these documents, too.”

U.S. officials have for years raised concerns about the Arctic becoming increasingly militarized as it becomes easier to navigate. Many of those questions focused on Russia, which has expanded its fleet of icebreaker ships to more than 40, reopened Cold War-era military bases in the region and said it will deploy new antiaircraft missiles on some of them.

But Russia has the largest Arctic border of any nation, making it a natural player in the region. It also has a history of cooperating with the United States on some Arctic issues, such as the search and rescue of mariners in distress.

China’s recent behavior in the Arctic has triggered some alarms in the Pentagon. Last year, it declared itself a “near-Arctic nation,” an effort to inject Beijing into Arctic discussions and defend its desire for a “Polar Silk Road,” in which Chinese goods would be delivered by sea from Asia to Europe.

Beijing also has offered to bankroll projects in the region with loans, such as three airports in Greenland that drew the concern of former defense secretary Jim Mattis because of their potential military applications. In that case, the Pentagon made the case to Denmark that it should fund the facilities, the Wall Street Journal first reported last month. Denmark eventually agreed to grant loans for the first two of them, with the Pentagon offering to fund undetermined airport infrastructure, defense officials said.

“Countries should be wary of piling on monumental debt, particularly ‘loan to own’ projects, that undermines their freedom of political action and sovereign choices,” Michael said. “Beijing’s lack of transparency in its polar research, expeditionary activities and approach to natural resource development is also of concern.”

Jim Townsend, a senior defense official during the Obama administration who focused on European and Arctic issues, said that “everyone is looking at China and how aggressive they are.”

“They have a lot to gain economically by having shipping times cut by having that northern passage,” he said.

The new review has been launched as the military services proceed at different paces in releasing Arctic strategy documents of their own.

Last year, Navy Secretary Richard V. Spencer told the Senate Armed Services Committee that changing conditions had prompted the service to launch a new review.

Outgoing Air Force Secretary Heather Wilson and Air Force Chief of Staff Gen. David L. Goldfein disclosed similar plans in a column they authored in January for Defense News.

Adm. Karl Schultz, the Coast Guard commandant, said in August that he expected his service’s strategy could be released by the end of 2018, but it still has not been.

“There is no specific release date nor sequencing with the DOD Arctic Strategy,” said Navy spokesman Lt. Derrick Ingle. “The Navy is working with the drafters of the DOD Arctic Strategy and our goals are aligned.”


Fred Barbash

Washington, D.C.

Coverage: Law, constitution and courts
Education: University of Wisconsin at Madison
 Fred Barbash has been with The Washington Post for 30 years in a multitude of roles including but not limited to Supreme Court reporter, National editor, London bureau chief and founding editor of The Post's Morning Mix. He has covered all three branches of government and courts on every level and has written widely on Constitutional history. He was born in Washington, D.C. but raised and educated in Madison, Wisconsin, where his father was a professor of economics. Fred is married and has two great children.  
Latest from Fred Barbash

The House Judiciary Committee has subpoenaed the full Mueller report, but it’s unlikely to be able to enforce its will.

  • Apr 3, 2019

As the NRA cheered the forceful ruling, Eric Tirschwell of Everytown for Gun Safety blasted “the dangerous gun lobby view that more lethal firearms will make America safer.”

  • Apr 1, 2019

The administration says it won't defend the Affordable Care Act because it's unconstitutional. The Obama administration made a similar argument when it declined to defend the Defense of Marriage Act.

  • Mar 29, 2019

A constitutional clash over that issue could arise between House Democrats and the Justice Department.

  • Mar 24, 2019

The legislature’s post-election special session sought to strip authority from the Democratic governor-elect.

  • Mar 21, 2019

Judges across the country have ruled against the administration more frequently than usual.

  • Mar 19, 2019

Federal judges have ruled against the Trump administration at least 63 times, often agreeing with plaintiffs that agency decision-making is arbitrary and capricious.

  • Mar 19, 2019


Connecticut’s Supreme Court ruled that the company can be sued over how it marketed the Bushmaster rifle, which was used to kill 20 children and six educators in 2012.

  • Mar 14, 2019

The attorney general blames “judicial activism” for administration setbacks in the courts. But the record suggests otherwise.

  • Oct 19, 2018

Mark Zuckerberg: Protecting democracy is an arms race. Here’s how Facebook can help.


Facebook CEO Mark Zuckerberg testifies before a House committee in April. (Andrew Harnik/AP)

Mark Zuckerberg is chief executive officer of Facebook.

When you build services that connect billions of people across countries and cultures, you’re going to see all of the good that humanity can do, and you’re also going to see people try to abuse those services in every way possible. Our responsibility at Facebook is to amplify the good and mitigate the bad.

This is especially true when it comes to elections. Free and fair elections are the heart of every democracy. During the 2016 election, we were actively looking for traditional cyberattacks, and we found them. What we didn’t find until later were foreign actors running coordinated campaigns to interfere with America’s democratic process. Since then, we’ve focused on improving our defenses and making it much harder for anyone to interfere in elections.

Key to our efforts has been finding and removing fake accounts — the source of much of the abuse, including misinformation. Bad actors can use computers to generate these in bulk. But with advances in artificial intelligence, we now block millions of fake accounts every day as they are being created so they can’t be used to spread spam, false news or inauthentic ads.

Increased transparency in our advertising systems is another area where we have made progress. You can now see all the ads an advertiser is running — even if they aren’t targeted to you. Anyone who wants to run political or issue ads in the United States on Facebook must verify their identity. All political and issue ads must also make clear who paid for them, in the same way as TV or newspaper advertisements. But we’ve gone even further by putting all these ads in a public archive, which anyone can search to see how much was spent on each individual ad and the audience it reached. This greater transparency will increase responsibility and accountability for advertisers.

As we’ve seen from previous elections, misinformation is a real challenge. A big part of the solution is getting rid of fake accounts. But it’s also about attacking the spammers’ economic incentives to create false news in the first place. And where posts are flagged as potentially false, we pass them to independent fact-checkers — such as the Associated Press and the Weekly Standard — to review, and we demote posts rated as false, which means they lose 80 percent of future traffic.

We’re not working alone. After 2016, it became clear that everyone — governments, tech companies and independent experts — needs to do a better job of sharing the signals and information they have to prevent this kind of abuse. These bad actors don’t restrict themselves to one service, and we shouldn’t approach the problem in silos, either. That’s why we’re working more closely with other technology companies on the cybersecurity threats we all face, and we’ve worked with law enforcement to take down accounts in Russia.

One of the biggest changes we’ve made over the past year is not to wait for reports of suspicious activity. Instead, we look proactively for potentially harmful election-related content, such as pages registered to a foreign entity that post divisive content to sow mistrust and drive people apart. When we find them, our security team manually reviews the accounts to see whether they violate our policies. If they do, we quickly remove them. For example, we recently took down a network of accounts in Brazil that was hiding its identity and spreading misinformation ahead of the country’s presidential elections in October.

For the U.S. midterm elections, we’re also using a new tool we tested in the Alabama Senate special election last year to identify political interference more quickly. This enabled us to find and remove foreign political spammers who’d previously flown under the radar. And last month, we took down hundreds of pages, groups and accounts for creating networks that were deliberately misleading people about their identities and intentions. Some originated in Iran and others in Russia.

I’m often asked how confident I feel about the midterms. We’ve made a lot of progress, as our work during the French, German, Mexican and Italian elections has shown. In each case, we identified and removed fake accounts and bad content leading up to the elections and, in Germany, we worked directly with the government to share information about potential threats. The investments we continue to make in people and technology will help us improve even further. But companies such as Facebook face sophisticated, well-funded adversaries who are getting smarter over time, too. It’s an arms race, and it will take the combined forces of the U.S. private and public sectors to protect America’s democracy from outside interference.


Policies and Standards


(Photo by Bill O’Leary/The Washington Post)

 

Additional Policies and Information
Terms of Service, RSS Terms of Service, Privacy Policy, and Submissions and Discussion Policy

Mission statement

The mission of The Washington Post is defined in a set of principles written by Eugene Meyer, who bought the newspaper in 1933. Today they are displayed in brass linotype letters in an entrance to the newsroom. (His gender references have been supplanted by our policy of inclusion, but the values remain).

The Seven Principles for the Conduct of a Newspaper

  1. The first mission of a newspaper is to tell the truth as nearly as the truth may be ascertained.
  2. The newspaper shall tell ALL the truth so far as it can learn it, concerning the important affairs of America and the world.
  3. As a disseminator of the news, the paper shall observe the decencies that are obligatory upon a private gentleman.
  4. What it prints shall be fit reading for the young as well as for the old.
  5. The newspaper’s duty is to its readers and to the public at large, and not to the private interests of its owners.
  6. In the pursuit of truth, the newspaper shall be prepared to make sacrifices of its material fortunes, if such course be necessary for the public good.
  7. The newspaper shall not be the ally of any special interest, but shall be fair and free and wholesome in its outlook on public affairs and public men.

Eugene Meyer, March 5, 1935

Ethics policy

(This represents a synthesis of Washington Post policies and is not meant to be comprehensive).

These policies are meant to guide Washington Post journalism as we deliver news and information in a rapidly changing media environment. We consider these guidelines to be a “living document” that we will continually modify and update based on feedback from our journalists, from our readers, and from our perceptions of our changing needs. Because the circumstances under which information is obtained and reported vary widely from one case to the next, these guidelines should not be understood as establishing hard and fast rules or as covering every situation that might arise.

Conflict of Interest

This news organization is pledged to avoid conflicts of interest or the appearance of conflict of interest wherever and whenever possible. We have adopted stringent policies on these issues, conscious that they may be more restrictive than is customary in the world of private business. In particular:

We pay our own way.

We accept no gifts from news sources. We accept no free trips. We neither seek nor accept preferential treatment that might be rendered because of the positions we hold. Exceptions to the no-gift rule are few and obvious — invitations to meals, for example, may be accepted when they are occasional and innocent but not when they are repeated and their purpose is deliberately calculating. Free admissions to any event that is not free to the public are prohibited. The only exception is for seats not sold to the public, as in a press box, or tickets provided for a critic’s review. Whenever possible, arrangements will be made to pay for such seats.

We do not accept payment – either honoraria or expenses – from governments, government-funded organizations, groups of government officials, political groups or organizations that take positions on controversial issues. A reporter or editor also cannot accept payment from any person, company or organization that he or she covers. And we should avoid accepting money from individuals, companies, trade associations or organizations that lobby government or otherwise try to influence issues the newspaper covers. Broadcast organizations, educational institutions, social organizations and many professional organizations usually fall outside this provision unless the reporter or editor is involved in coverage of them.

It is important that no freelance assignments and no honoraria be accepted that might in any way be interpreted as disguised gratuities. We make every reasonable effort to be free of obligation to news sources and to special interests. We must be wary of entanglement with those whose positions render them likely to be subjects of journalistic interest and examination. Our private behavior as well as our professional behavior must not bring discredit to our profession or to The Post.

We avoid active involvement in any partisan causes — politics, community affairs, social action, demonstrations — that could compromise or seem to compromise our ability to report and edit fairly. Relatives cannot fairly be made subject to Post rules, but it should be recognized that their employment or their involvement in causes can at least appear to compromise our integrity. The business and professional ties of traditional family members or other members of your household must be disclosed to department heads.

Fairness

Reporters and editors of The Post are committed to fairness. While arguments about objectivity are endless, the concept of fairness is something that editors and reporters can easily understand and pursue. Fairness results from a few simple practices: No story is fair if it omits facts of major importance or significance. Fairness includes completeness.

No story is fair if it includes essentially irrelevant information at the expense of significant facts. Fairness includes relevance.

No story is fair if it consciously or unconsciously misleads or even deceives the reader. Fairness includes honesty–leveling with the reader.

No story is fair if it covers individuals or organizations that have not been given the opportunity to address assertions or claims about them made by others. Fairness includes diligently seeking comment and taking that comment genuinely into account.

Taste

The Washington Post respects taste and decency, understanding that society’s concepts of taste and decency are constantly changing. A word offensive to the last generation can be part of the next generation’s common vocabulary. But we shall avoid prurience. We shall avoid profanities and obscenities unless their use is so essential to a story of significance that its meaning is lost without them. In no case shall obscenities be used without the approval of the executive or managing editors.

If editors decide that content containing potentially offensive material has a legitimate news value, editors should use visual and/or text warnings about such material. For example, we may link to a Web page that contains material that does not meet standards for Post original content, but we let users know what they might see before they click the link by including a warning, such as “Warning: Some images on this site contain graphic images of war.”

Finally, we do not link to sites that aid or abet illegal activity. Consult with the Legal Department if you have a question about whether a site falls under this rule.  

Opinion

The separation of news columns from the editorial pages is solemn and complete. This separation is intended to serve the reader, who is entitled to the facts in the news columns and to opinions on the editorial and “op-ed” pages. But nothing in this separation of functions is intended to eliminate from the news columns honest, in-depth reporting, or analysis or commentary when plainly labeled. The labels are defined as follows:

Analysis: Interpretation of the news based on evidence, including data, as well as anticipating how events might unfold based on past events

Perspective: Discussion of news topics with a point of view, including narratives by individuals regarding their own experiences.

Opinion: A column or blog in the Opinions section.

Review: A professional critic’s assessment of a service, product, performance, or artistic or literary work.

Social Media

When using networks such as Facebook and Twitter, whether for reporting or in our personal lives, we must protect our professional integrity and remember: Washington Post journalists are always Washington Post journalists.

Social-media accounts maintained by Washington Post journalists reflect upon the reputation and credibility of the newsroom. Even as we express ourselves in more personal and informal ways to forge better connections with our readers, we must be ever mindful of preserving the reputation of The Washington Post for journalistic excellence, fairness and independence. Every comment or link we share should be considered public information, regardless of privacy settings.

Post journalists must refrain from writing, tweeting or posting anything – including photographs or video – that could objectively be perceived as reflecting political, racial, sexist, religious or other bias or favoritism.

The National and Community Interest

The Washington Post is vitally concerned with the national interest and with the community interest. We believe these interests are best served by the widest possible dissemination of information. The claim of national interest by a federal official does not automatically equate with the national interest. The claim of community interest by a local official does not automatically equate with the community interest.

A Journalist’s Role

Although it has become increasingly difficult in an Internet age, reporters should make every effort to remain in the audience, to be the stagehand rather than the star, to report the news, not to make the news.

In gathering news, journalists will not misrepresent their identity or their occupation. They will not portray themselves as police officers, physicians or anything other than journalists.

Verification and fact-checking standards

Washington Post reporters have primary responsibility for reporting, writing, and fact-checking their stories. Stories are subject to review by one or more editors; The Post has a multi-level structure for the review and editing of stories that may include fact-checking. Assignment editors (department heads, their deputy editors and assistant editors) collaborate with reporters on the origination of stories and typically provide initial review when a story is submitted by a reporter. Multiplatform editors (also called copy editors) often provide initial review on breaking news stories and routinely provide second-level review on print and other less time-sensitive stories. Senior editors have overall oversight of the daily and weekend report for digital publication throughout the day as well as The Post’s print editions. Editors who oversee digital platforms also may be involved in the presentation of stories as well as headlines, news alerts and newsletters. The number of editors who review a story prior to publication and the extent of their involvement varies depending on a range of factors, including complexity, sensitivity, and the pressure of time.

Diversity policy

Diversity is at the core of Washington Post journalism. Accurately reporting stories from the United States and around the world means engaging a variety of voices as interviewees and first-person writers, striving for a staff that reflects a range of backgrounds and life experiences, and seeking feedback from all who would give it.

We submit data on staff diversity to the American Society of News Editors. You can find that here.

Ownership structure

The Washington Post is owned by Jeffrey P. Bezos, the founder and chief executive of Amazon.com. The Post was founded in 1877.

Corrections policy

Policy

The Washington Post strives for a nimble, accurate and complete news report. We endeavor to be promptly responsive in correcting errors in material published on digital platforms and in print. When we run a correction, clarification or editor’s note, our goal is to tell readers, as clearly and quickly as possible, what was wrong and what is correct. Anyone should be able to understand how and why a mistake has been corrected.

Updating a digital report

Our individual pieces of journalism evolve as we sharpen and improve them. Our readers expect that from us in the digital age. It is unnecessary to put notes on stories stating that a story has been updated unless there is a particular reason to note the addition of new information or other change; the time stamp signals to readers that they are reading a developing story. It is necessary to use a correction, clarification or editor’s note to inform readers whenever we correct a significant mistake.

Corrections

If we are substantively correcting an article, photo caption, headline, graphic, video or other material, we should promptly publish a correction explaining the change.

Clarification

When our journalism is factually correct but the language we used to explain those facts is not as clear or detailed as it should be, the language should be rewritten and a clarification added to the story. A clarification can also be used to note that we initially failed to seek a comment or response that has since been added to the story or that new reporting has shifted our account of an event.

Editor’s Notes
A correction that calls into question the entire substance of an article, raises a significant ethical matter or addresses whether an article did not meet our standards, may require an Editor’s Note and be followed by an explanation of what is at issue. A senior editor must approve the addition of an Editor’s Note to a story.

Other Corrections Policies

  1. When an error is found by a reader and posted to the comment stream, the audience engagement team should indicate in comments that it has been corrected.
  2. If we have sent out incorrect information in an alert, we should send out an alert informing people that the news reported in the earlier alert was wrong and give readers the accurate information.
  3. When we publish erroneous information on social networks, we should correct it on that platform.
  4. We do not attribute blame to individual reporters or editors (e.g. “because of a reporting error” or “because of an editing error”). But we may note that an error was the result of a production problem or because incorrect information came to us from a trusted source (wire services, individuals quoted, etc.)

Take-down (unpublish) requests

Because of the ease with which our published content can be searched and retrieved online, even years after publication, we are increasingly being asked to take down (or “unpublish”) articles from our website.

As a matter of editorial policy, we do not grant take-down requests, which should be vetted at the highest level. If the subject claims that the story was inaccurate, we should be prepared to investigate and, if necessary, publish a correction. And there may be situations in which fairness demands an update or follow-up coverage — for example, if we reported that a person was charged with a crime but did not report that the charges were later dismissed for lack of evidence. In short, our response will be to consider whether further editorial action is warranted, but not to remove the article as though it had never been published. When we publish publicly available personal data, we only will review takedown requests if the person involved is under threat of physical harm because of the existence of the material.

Policy on sources

The Washington Post is committed to disclosing to its readers the sources of the information in its stories to the maximum possible extent. We want to make our reporting as transparent to the readers as possible so they may know how and where we got our information. Transparency is honest and fair, two values we cherish.

Confidential Sources

Sources often insist that we agree not to name them before they agree to talk with us. We must be reluctant to grant their wish. When we use an unnamed source, we are asking our readers to take an extra step to trust the credibility of the information we are providing. We must be certain in our own minds that the benefit to readers is worth the cost in credibility.

In some circumstances, we will have no choice but to grant confidentiality to sources. We recognize that there are situations in which we can give our readers better, fuller information by allowing sources to remain unnamed than if we insist on naming them. We realize that in many circumstances, sources will be unwilling to reveal to us information about corruption in their own organizations, or high-level policy disagreements, for example, if disclosing their identities could cost them their jobs or expose them to harm. Nevertheless, granting anonymity to a source should not be done casually or automatically.

Named sources are vastly to be preferred to unnamed sources. Reporters should press to have sources go on the record. We have learned over the years that persistently pushing sources to identify themselves actually works—not always, of course, but more often than many reporters initially expect. If a particular source refuses to allow us to identify him or her, the reporter should consider seeking the information elsewhere.

Editors have an obligation to know the identity of unnamed sources used in a story, so that editors and reporters can jointly assess the appropriateness of using them. Some sources may insist that a reporter not reveal their identity to her editors; we should resist this. When it happens, the reporter should make clear that information so obtained cannot be published. The source of anything that is published will be known to at least one editor.

We prefer at least two sources for factual information in Post stories that depend on confidential informants, and those sources should be independent of each other. We prefer sources with firsthand or direct knowledge of the information. A relevant document can sometimes serve as a second source. There are situations in which we will publish information from a single source, but we should do so only after deliberations involving the executive editor, the managing editor and the appropriate department head. The judgment to use a single source depends on the source’s reliability and the basis for the source’s information.

We must strive to tell our readers as much as we can about why our unnamed sources deserve our confidence. Our obligation is to serve readers, not sources. This means avoiding attributions to “sources” or “informed sources.” Instead we should try to give the reader something more, such as “sources familiar with the thinking of defense lawyers in the case,” or “sources whose work brings them into contact with the county executive,” or “sources on the governor’s staff who disagree with his policy.”

Dealing With Sources

We strive to treat sources fairly. This means putting statements we quote into context, and summarizing the arguments of people we quote in ways that are recognizably fair and accurate. Potentially controversial statements by public figures and others should be quoted in a complete sentence or paragraph when possible, and in context. In some cases, this will mean making clear what question was being answered when the statement was made.

When seeking comment from persons who are the subject of a story, we should give them a reasonable opportunity to respond to us. This means not calling at the last minute before deadline if we have any choice about timing.

We do not promise sources that we will refrain from additional reporting or efforts to verify the information they may give us.

We should not publish ad hominem quotations from unnamed sources. Sources who want to take a shot at someone should do so in their own names.

We should avoid blind quotations whose only purpose is to add color to a story.

We do not use pseudonyms, and we do not mislead our readers about the identities of people who appear in our stories. In the rare situations when we decide to identify someone by other than their full name, we do so in a straightforward manner—by using a first name only, for example. Editors must participate in decisions to provide less than a full name, and we must explain to readers why we are not using full names.

We do not fool or mislead sources. When identifying ourselves, we say we are reporters for The Post. Our reporting should be honorable; we should be prepared to explain publicly anything we do to get a story.

Attribution

We must be truthful about the source of our information. Facts and quotations in a story that were not produced by our own reporting must be attributed. Attribution of material from other media must be total. Plagiarism is not permitted. It is the policy of this newspaper to give credit to other publications that develop exclusive stories worthy of coverage by The Post.

Readers should be able to distinguish between what the reporter saw and what the reporter obtained from other sources such as wire services, pool reporters, e-mail, websites, etc.

We place a premium value on original reporting. We expect Washington Post reporters to see as much as they can of the story they are reporting, and to talk to as many participants as possible. Reporters should consider the advantages of reporting from the scene of events they are covering whenever that is possible.

If a reporter was not present at a scene described in a story, the story should make that clear. Assertions that something actually happened although it was unseen by the reporter should be attributed, so the narrative device of describing an event as it was recounted to us by witnesses must include attribution. If we reconstruct statements or exchanges between people based on the recollections of those people or witnesses who heard them speak, we must attribute those recollections transparently. If you are unsure about the application of these guidelines in a particular situation, discuss it with your editors.

In some circumstances where a source has allowed us to see something that reporters would not otherwise be able to observe, special problems of attribution may arise. They should always be discussed with editors.

Any significant reporting by a stringer, staff member, or other Post employee should be credited in a byline or a tagline at the end of a story. When such people take notes from broadcasts of news events on radio or television, conduct basic research or check routine facts, they need not be credited.

Ground Rules

Journalistic ground rules can be confusing, but our goal is clarity in our dealings with sources and readers. This means explaining our ground rules to sources, and giving readers as much information as possible about how we learned the information in our stories. If a source is not on the record, it is important to establish ground rules at the beginning of a conversation. In a taped interview, it is preferable for the discussion of ground rules to be on the tape. We strongly prefer on-the-record interviews to all other types, but we recognize that getting sources on the record is not always possible. When it is not, we owe readers explanations as to why not, as discussed above.

We should start virtually all interviews with the presumption that they are on the record. Inexperienced sources—usually ordinary people who unexpectedly find themselves in the news—should clearly understand that you are a reporter and should not be surprised to find themselves quoted in the newspaper.

In establishing ground rules, the following are The Washington Post’s definitions of various forms of attribution. People use these terms to mean different things, so if your dealings with a source are going to be anything other than “on the record,” you should have a discussion to clarify the terms before you begin an interview.

On the record: For quotation, attributable to the source by name.

On background, or not for attribution: These both mean the same thing: information that can be attributed to “a police department official” or “a player on the team” who is not named. We must be careful, when dealing with sources who say they want to provide information “on background,” to explain that, to us, this means we can quote the statement while maintaining the confidentiality of the source. Some sources will try to negotiate the terms of art in “background” attribution—for example, a State Department official may ask to be identified as “an administration official.” We should try to put the reader’s interest first. In a story about a fight between the Pentagon and the State Department, for example, quoting “an administration official” is useless to readers. Use good judgment, and press for maximum revelation in attribution.

Deep background: This is a tricky category, to be avoided if possible. Information accepted on “deep background” can be included in the story, but not attributed. That means there is no way to help readers understand where it is coming from, which is why we discourage the use of deep background. You can also use information received on deep background as the basis for further reporting.

Off the record: This is the trickiest of all, because so many people misuse the term. By the traditional definition, off-the-record information cannot be used for publication or in further reporting. But many sources, including some sophisticated officials, use the term when they really mean “not for attribution to me.” We must be very careful when dealing with sources who say they want to be “off the record.” If they mean “not for attribution to me,” we need to explain the difference, and discuss what the attribution will actually be. If they really mean off the record as the term is traditionally defined, then in most circumstances, we should avoid listening to such information at all. We do not want to be hamstrung by a source who tells us something that becomes unusable because it is provided on an off-the-record basis.

A source may be willing to give us information for our guidance or to prompt further reporting, on the understanding that we will not use his or her comments as the basis for publication.

Quoting Sources and Sharing Information

Our objective in quoting people is to capture both their words and intended meaning accurately. That requires care in negotiating ground rules with sources. We do not allow sources to change the rules governing specific quotations after the fact. Once a quote is on the record, it remains there.

Sometimes, a source will agree to be interviewed only if we promise to read quotations back to the source before publication. We should not allow sources to change what was said in an original interview, although accuracy or the risk of losing an on-the-record quote from a crucial source may sometimes require it. A better and more acceptable alternative is to permit a source to add to a quotation and then explain that sequence to readers. If you find yourself in this gray area, consult with your editor.

Some reporters share sections of stories with sources before publication, to ensure accuracy on technical points or to catch errors. A science writer, for instance, may read to a source a passage, or even much of a story, about a complex subject to make sure that it is accurate. But it is against our policy to share drafts of entire stories with outside sources prior to publication, except with the permission—which will be granted extremely rarely—of the Executive or Managing Editors.

In negotiating terms of engagement with a source, reporters and editors should be prepared for everything they say or write, in any medium, on the telephone or in person, to become public. They should make no promises, agree to no compromises and offer no concessions that aren’t compatible with this policy and The Post’s standards. Clarity and straightforwardness in our communications with sources are essential.

Expert Sources

We quote a lot of people in The Post. We’re always interviewing men and women on the street, and we seem to depend ever more on “experts” to provide context for stories, make interpretive points or offer judgments about subjects we are covering. This is a healthy trend. But it is important to think about who we are quoting, either for citizen reaction or for expert guidance.

We must strive always to get a rich variety of voices into our work. This means avoiding dependence on the same academics or public figures for reactions to stories. We all must look for new specialists—especially women, younger people, people of color, unconventional thinkers and people who aren’t routinely quoted by us and other media outlets, but who constitute a large part of our readership, and of the general population. This won’t happen unless we make an effort. Reporters need to expand their universe of sources.

Similarly, we need to remember to talk to a broad range of individuals who are affected by the events we cover. When we write about a new school board policy, we should talk to students, teachers and parents about its impact. When we cover a company’s sale or move, we should hear from affected employees. The voices of ordinary citizens of all ages should be a regular part of our journalism—more than they have been in the past.

Reader engagement and feedback

Today’s readers are engaged readers. The Post welcomes reader contributions; their scrutiny often improves our journalism.

Nearly all of our articles offer comment sections for readers to discuss the news of the day or provide comments on our work. Readers can also interact with Post reporters during our live chats.

Reporters and editors are encouraged to participate in comments, but the extent of their participation, if any, depends on their preferences. We ask commenters to keep their tone civil and observe the rules outlined in our Discussion and Submission Guidelines; readers and staff may report offensive comments to our 24/7 moderation team. We provide an opportunity for those mentioned in our reporting to have their comments highlighted.

Other ways to contact us:

  • Our online Help Center directs questions about The Post to the appropriate departments for resolution. The Help Center is also available by phone at 1-800-477-4679.
  • Letters to the editor can be sent to letters@washpost.com or to Letters to the Editor, The Washington Post, 1301 K Street NW, Washington D.C. 20071. Letters selected by editors will appear online and in our print edition.
  • Op-ed pieces can be submitted using this form. Op-eds selected by editors will appear in our digital and print editions.
  • Questions can be emailed to the reader representative at readers@washpost.com.

Pentagon tells U.S. military bases to stop selling ZTE, Huawei phones


This 2012 file photo shows staff and visitors walking past the lobby of telecommunications equipment firm Huawei Technologies in Wuhan, in central China's Hubei province. (AFP/Getty Images)

U.S. service members will no longer be able to purchase ZTE and Huawei phones on military bases, according to a new Defense Department directive that cites security risks posed by the devices.

“Huawei and ZTE devices may pose an unacceptable risk to the Department's personnel, information and mission,” Pentagon spokesman Major Dave Eastburn said in a statement. “In light of this information, it was not prudent for the Department's exchanges to continue selling them to DoD personnel.”

The Pentagon declined to provide the technical details of potential threats.

The order to halt the sale of Huawei and ZTE phones and remove them from the military exchanges was given last Friday, the Pentagon said. Mobile Internet modems and other wireless products are also included in the ban. The order was reported earlier Wednesday by Stars and Stripes and the Wall Street Journal.

The order doesn't outright prevent service members from using the devices or from bringing them to work. But the Pentagon cautioned: “Service members should be mindful of the security risks posed by the use of Huawei devices, regardless of where they were purchased.”

The decision is the latest move by the Trump administration to limit the influence of Chinese wireless equipment manufacturers, stemming from fears that a more dominant Chinese tech presence could make it easier for Beijing to hack or spy on American businesses and military personnel.

ZTE and Huawei did not immediately respond to requests for comment.

The ban follows a highly unusual move earlier this year, when President Trump ordered Singapore-based Broadcom to abandon its $117 billion hostile bid for Qualcomm, blocking what would have been one of the largest technology deals in history. Trump cited “credible evidence” in his presidential order that the takeover threatened “to impair the national security of the United States.”

The Federal Communications Commission has also taken steps to ban federal funds from being spent on wireless equipment made by companies that pose a national security threat to U.S. communication networks. Both ZTE and Huawei were mentioned in the FCC’s proposal in a section detailing the federal government's concerns with foreign tech providers.

Read more:

Opinion: America is hanging up on China's telecom industry

The FCC wants to slap restrictions on some Chinese-made wireless gear


Your WiFi-connected thermostat can take down the whole Internet. We need new regulations.

The government has to get involved in the “Internet of Things.”


Bruce Schneier is a security technologist and a lecturer at the Kennedy School of Government at Harvard University. His new book, "Click Here to Kill Everybody," will be published in September.

Late last month, popular websites like Twitter, Pinterest, Reddit and PayPal went down for most of a day. The distributed denial-of-service attack that caused the outages, and the vulnerabilities that made the attack possible, were as much a failure of market and policy as of technology. If we want to secure our increasingly computerized and connected world, we need more government involvement in the security of the “Internet of Things” and increased regulation of what are now critical and life-threatening technologies. It’s no longer a question of if; it’s a question of when.

First, the facts. Those websites went down because their domain name provider — a company named Dyn — was forced offline. We don’t know who perpetrated that attack, but it could have easily been a lone hacker. Whoever it was launched a distributed denial-of-service attack against Dyn by exploiting a vulnerability in large numbers — possibly millions — of Internet-of-Things devices like webcams and digital video recorders, then recruiting them all into a single botnet. The botnet bombarded Dyn with traffic, so much that it went down. And when it went down, so did dozens of websites.
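The arithmetic behind such an attack can be sketched in a few lines. The numbers below are purely illustrative assumptions, not measurements of the actual Dyn attack; they show only why a very large number of low-bandwidth devices can overwhelm even a well-provisioned target.

```python
# Back-of-the-envelope sketch: why many weak IoT devices add up to
# overwhelming traffic. All figures are illustrative assumptions,
# not measurements of the actual attack on Dyn.

def botnet_traffic_gbps(num_devices, per_device_mbps):
    """Aggregate attack bandwidth, in gigabits per second."""
    return num_devices * per_device_mbps / 1000.0

devices = 1_000_000        # hypothetical compromised webcams and DVRs
per_device_mbps = 1.0      # modest upload bandwidth per device
capacity_gbps = 500.0      # hypothetical capacity of the targeted provider

attack = botnet_traffic_gbps(devices, per_device_mbps)
print(attack)                    # 1000.0
print(attack > capacity_gbps)    # True
```

Real botnets vary widely in size and in per-device bandwidth; the point is the aggregation, not the specific figures.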

Your security on the Internet depends on the security of millions of Internet-enabled devices, designed and sold by companies you’ve never heard of to consumers who don’t care about your security.

The technical reason these devices are insecure is complicated, but there is a market failure at work. The Internet of Things is bringing computerization and connectivity to many tens of millions of devices worldwide. These devices will affect every aspect of our lives, because they’re things like cars, home appliances, thermostats, lightbulbs, fitness trackers, medical devices, smart streetlights and sidewalk squares. Many of these devices are low-cost, designed and built offshore, then rebranded and resold. The teams building these devices don’t have the security expertise we’ve come to expect from the major computer and smartphone manufacturers, simply because the market won’t stand for the additional costs that would require. These devices don’t get security updates like our more expensive computers, and many don’t even have a way to be patched. And, unlike our computers and phones, they stay around for years and decades.

An additional market failure illustrated by the Dyn attack is that neither the seller nor the buyer of those devices cares about fixing the vulnerability. The owners of those devices don’t care. They wanted a webcam — or thermostat, or refrigerator — with nice features at a good price. Even after they were recruited into this botnet, they still work fine — you can’t even tell they were used in the attack. The sellers of those devices don’t care: They’ve already moved on to selling newer and better models. There is no market solution because the insecurity primarily affects other people. It’s a form of invisible pollution.

And, like pollution, the only solution is to regulate. The government could impose minimum security standards on IoT manufacturers, forcing them to make their devices secure even though their customers don’t care. They could impose liabilities on manufacturers, allowing companies like Dyn to sue them if their devices are used in DDoS attacks. The details would need to be carefully scoped, but either of these options would raise the cost of insecurity and give companies incentives to spend money making their devices secure.

It’s true that this is a domestic solution to an international problem and that there’s no U.S. regulation that will affect, say, an Asian-made product sold in South America, even though that product could still be used to take down U.S. websites. But the main costs in making software come from development. If the United States and perhaps a few other major markets implement strong Internet-security regulations on IoT devices, manufacturers will be forced to upgrade their security if they want to sell to those markets. And any improvements they make in their software will be available in their products wherever they are sold, simply because it makes no sense to maintain two different versions of the software. This is truly an area where the actions of a few countries can drive worldwide change.

Regardless of what you think about regulation vs. market solutions, I believe there is no choice. Governments will get involved in the IoT, because the risks are too great and the stakes are too high. Computers are now able to affect our world in a direct and physical manner.

Security researchers have demonstrated the ability to remotely take control of Internet-enabled cars. They’ve demonstrated ransomware against home thermostats and exposed vulnerabilities in implanted medical devices. They’ve hacked voting machines and power plants. In one recent paper, researchers showed how a vulnerability in smart lightbulbs could be used to start a chain reaction, resulting in them all being controlled by the attackers — that’s every one in a city. Security flaws in these things could mean people dying and property being destroyed.

Nothing motivates the U.S. government like fear. Remember 2001? A small-government Republican president created the Department of Homeland Security in the wake of the Sept. 11 terrorist attacks: a rushed and ill-thought-out decision that we’ve been trying to fix for more than a decade. A fatal IoT disaster will similarly spur our government into action, and it’s unlikely to be well-considered and thoughtful action. Our choice isn’t between government involvement and no government involvement. Our choice is between smarter government involvement and stupider government involvement. We have to start thinking about this now. Regulations are necessary, important and complex — and they’re coming. We can’t afford to ignore these issues until it’s too late.

In general, the software market demands that products be fast and cheap and that security be a secondary consideration. That was okay when software didn’t matter — it was okay that your spreadsheet crashed once in a while. But a software bug that literally crashes your car is another thing altogether. The security vulnerabilities in the Internet of Things are deep and pervasive, and they won’t get fixed if the market is left to sort it out for itself. We need to proactively discuss good regulatory solutions; otherwise, a disaster will impose bad ones on us.

Read more:

By November, Russian hackers could target voting machines

Your iPhone just got less secure. Blame the FBI.

Hackers don’t want to crash stock exchanges. They want to make money off them.


Trump invokes new demand for extracting billions of dollars from U.S. allies


U.S. military helicopters at Camp Humphreys in South Korea last week. (AFP/Getty Images)

In private discussions with his aides, President Trump has devised an eye-popping formula to address one of his long-standing complaints: that allies hosting U.S. forces don’t pay Washington enough money.

Under the formula, countries would pay the full cost of stationing American troops on their territory, plus 50 percent more, said U.S. and foreign officials familiar with the idea, which could have allies contributing five times what they provide.
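As a rough sketch of how the reported formula could yield a figure like "five times" (all numbers here are illustrative assumptions, not official data): an ally that currently covers 30 percent of basing costs would, under cost plus 50, be billed 150 percent of those costs.

```python
# Rough arithmetic behind the reported "cost plus 50" formula.
# All figures are illustrative assumptions, not official data.

def cost_plus_50_bill(total_cost):
    """Ally's bill under the reported formula: full basing cost plus 50 percent."""
    return 1.5 * total_cost

total_cost = 1.0e9                        # hypothetical annual basing cost
current_share = 0.30                      # ally currently covers 30% of that cost
current_payment = current_share * total_cost

new_bill = cost_plus_50_bill(total_cost)
print(new_bill / current_payment)         # 5.0
```

The multiple depends entirely on how much an ally pays today: a country already covering half of the cost would see roughly a threefold increase, not fivefold.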

Trump calls the formula “cost plus 50,” and it has struck fear in the hearts of U.S. allies who view it as extortionate.

Rumors that the formula could become a global standard have especially rattled Germany, Japan and South Korea, which host tens of thousands of U.S. troops, and U.S. officials have mentioned the demand to at least one country in a formal negotiation setting, said people familiar with the matter.

National Security Council spokesman Garrett Marquis said the Trump administration “is committed to getting the best deal for the American people” but would not comment “on any ongoing deliberations regarding specific ideas.”

Trump has long complained that U.S. and NATO allies freeload on U.S. military protection, but the cost-plus-50 formula has only gained traction in recent months, said current and former U.S. officials, who like others spoke on the condition of anonymity to discuss sensitive negotiations.

It is not a formal proposal or policy but serves as a kind of “maximum billing” option designed in part to draw attention to an issue that speaks to Trump’s demand that allies shoulder more of the burden of their own defense, a senior administration official said.

One of the first U.S. allies to confront the Trump administration’s hardball tactics was South Korea, which last month agreed to pay $925 million for hosting 28,500 American troops. That was an 8.2 percent increase from the previous year’s payment and about half the total costs. South Korean officials preferred a five-year agreement, but the deal covers only one, meaning they could face pressure to meet Trump’s cost-plus-50 demand next year.

A U.S. military official said U.S. Forces Korea had been “sweating” the signing of a new agreement for months.

There are numerous burden-sharing ideas floating around, and Trump has not settled on any one, officials said.

Burden-sharing debate

Although it may be a red herring, the phrase “cost plus 50” has appeared on informal lists of options, one official said. But it is not clear what Trump advisers mean by “cost,” whether it’s the entire budget to run a base and pay U.S. armed forces or some part of that.

U.S. allies hosting permanent American military installations pay for a portion of costs in various ways. Japan and South Korea make cash contributions, while Germany supports the U.S. troop presence through in-kind contributions such as land, infrastructure and construction, in addition to forgone customs duties and taxes.

Trump has called that “in-kind” contribution insufficient, a senior U.S. diplomat said.

For decades, leading foreign policy figures in both parties have urged U.S. allies to take on greater responsibility for their security, but even staunch advocates of burden-sharing have questioned Trump’s approach.

“Trump is correct in wanting U.S. allies to bear more responsibility for collective defense, but demanding protection money from them is the wrong way to do it,” said Stephen Walt, a scholar of international relations at Harvard University. “Our armed forces are not mercenaries, and we shouldn’t send U.S. troops into harm’s way just because another country is paying us.”

The cost-plus-50 idea would probably not be presented as a blanket demand to all allies, even if Trump ended up signing off on it, several people familiar with elements of the discussion said. Many of his top aides oppose the formula and have succeeded in the past in bringing him down from the maximalist approach, the people said.

The existence of Trump’s formula was first reported by Bloomberg News.

Critics of U.S. bases around the world say the bases are costly, stoke tensions with adversaries and have unintended consequences. The Pentagon counters that its 54,000 troops in Japan and presence in South Korea allow it to project power and deter North Korea and China.

In Germany, where the Pentagon has more than 33,000 troops, the U.S. Army announced last year that it could add 1,500 more by 2020 in “a display of our continued commitment to NATO and our collective resolve to support European security.”

An ‘inaccurate explanation’

Trump’s idea has been rumored in European capitals for months, though senior European diplomats said they knew of no formal presentations or threat from the White House. Such a proposal appears aimed principally at Germany, the subject of frequent Trump complaints about NATO defense spending and what he says is an unfair German reliance on American forces for its defense.

Trump does not accept the argument that U.S. forces in Germany are a strategic asset for the United States and perhaps an overall cost savings because they help facilitate U.S. military actions in the Middle East and Africa as well as across the European continent, former U.S. officials said.

That disconnect predates the discussion of billing Germany for the cost of basing forces there, and some former advisers had hoped they could steer Trump toward a wider view of what the United States gains from the arrangement. American lives that might have otherwise been lost on the battlefields of Afghanistan, Iraq and elsewhere, for example, are often saved at Landstuhl military hospital in Germany.

“When he says, ‘Thirty thousand American forces are there protecting Germany,’ that is a completely inaccurate explanation of what American forces in Germany are there for,” retired Lt. Gen. Ben Hodges III said in an interview in the fall as Trump’s rhetoric on the issue heated up. Hodges was addressing the president’s complaints about the number of U.S. forces in Germany — more than 30,000 — and threats to downsize or relocate forces, not the specific idea of billing Germany.

The benefit to the United States can’t be measured in the transactional ways Trump frames it, said Hodges, who served as commanding general of the U.S. Army in Europe. “Like with our base in Ramstein, this is a platform for power projections in the Middle East, Africa, Russia.”

Emma Ashford, a scholar at the libertarian Cato Institute, agrees with Trump that the U.S. military is overextended but said his latest gambit is the wrong tactic.

“The solution to America’s unbalanced commitment to rich allies is to gradually shift the burden to them and remove the troops,” she said. “Not to keep American troops there and charge for them like they’re mercenaries.”

The discussion comes as allies prepare for the annual summer summit, where Trump has twice berated German Chancellor Angela Merkel over her country’s defense contributions. Trump routinely misstates the NATO funding arrangement and defense spending targets, but Germany acknowledges that it has not met the threshold goal of spending 2 percent of gross domestic product on defense.

Trump could undermine the effort to increase European NATO defense spending if he starts demanding bilateral payments, said Jeffrey Rathke, president of the American Institute for Contemporary German Studies at Johns Hopkins University.

“The United States, including under the Trump administration, has had a lot of success in persuading Germany and other NATO allies that they need to contribute more to their own defense,” Rathke said. “That is possible because the spending is directed at a common NATO objective, and that is collective defense,” which is more politically palatable in Western Europe.
